Quantum and AI: Realistic Synergy or Just Hype?

On Tuesday, February 11, I participated in a panel discussion on Quantum and AI at the LEAP/DeepFest conference in Riyadh, Saudi Arabia. Below, I’ve expanded on my thoughts from the event.

Quantum computing and AI are two of the most discussed topics in the deep tech space. Whether from researchers at major corporations, academics-turned-entrepreneurs, investors, CEOs, analysts, or even social media influencers, these fields have captured the attention of many.

While we can discuss them separately, what happens when we combine their capabilities or at least use them in tandem? Let’s explore three ways they might work together.

Quantum for AI

If both AI and quantum computing are impressive on their own, why not combine them to make AI even better?

Quantum computing offers a fundamentally different approach from classical computing. Traditional computing, which dates back to the 1940s, uses bits (0s and 1s) to process information on devices like phones, laptops, and servers. Quantum computers instead rely on qubits. A qubit can hold a combination of two states at once, and every qubit you add doubles the amount of information the machine can represent: with two qubits you can represent 4 values, with three qubits 8, and so on. This exponential growth is what makes the idea so exciting.

This has led some to claim that quantum computers will soon be able to process far more data for AI applications than classical systems can. However, there’s a catch: how do you get all that data into the qubits? Several loading methods exist, but they’re still quite slow. So be cautious when you hear about “Quantum for AI” innovations, especially if they’re described as “small scale” or “prototype”. Most of the progress so far addresses small problems, and quantum computers aren’t yet powerful enough for commercial use.
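To put rough numbers on the loading problem, here is a minimal Python sketch (classical bookkeeping only, nothing vendor-specific). Amplitude encoding packs a classical vector into the amplitudes of an n-qubit state; the state itself is compact, but preparing an arbitrary state generally needs a circuit whose size grows with the number of amplitudes, i.e. exponentially in the qubit count.

import numpy as np

# Illustrative only: how many qubits a data vector needs, and why loading it is slow.
data = np.random.rand(2 ** 20)                  # about a million classical values
n_qubits = int(np.log2(data.size))              # qubits needed to hold them as amplitudes
state = data / np.linalg.norm(data)             # quantum amplitudes must be unit-norm
print(f"{data.size:,} values fit into {n_qubits} qubits,")
print(f"but generic state preparation needs on the order of {2 ** n_qubits:,} gates")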

Without error correction, qubits are extremely fragile. They only work for a very short time before they lose their data. If loading AI data takes too long, there may not be enough time left to actually process it. This makes it clear that classical AI is likely to remain more practical for many use cases in the near term.

That said, machine learning is ultimately about finding patterns in data, and quantum computing’s very different computational model could potentially help us discover new kinds of patterns or speed up the search. At this stage, it’s more about potential than demonstrated performance. Vendors will often demonstrate “Quantum for AI” on their hardware, but I’d prefer to see a demonstration involving quantum chemistry, as that’s likely to be the first successful real-world use case for quantum computing.

AI for Quantum

I find this aspect far more intriguing at the moment. Can we use AI to help improve quantum computers?

Quantum computers rely on qubits as their basic units of information. But how many qubits do we actually need? Some experts claim that dozens to a few thousand qubits will be enough, but I believe we’ll need around 100,000 physical qubits for quantum computers to be fully functional. These qubits can be either manufactured (like superconducting or silicon spin qubits) or natural (such as trapped ions, neutral atoms, or photons).

The reason we’re so interested in quantum computing is that nature itself works using quantum mechanics. Atoms, electrons, photons, and molecules all operate within this framework, so it makes sense to emulate nature’s methods to solve tough problems. However, there’s a big challenge: while nature’s quantum systems work, they don’t always cooperate with our calculations. Noise from the environment interferes with qubits, causing errors in their operations. Think of it like a calculator that miscalculates due to electrical static, or a phone call with audio interference.

Researchers have started using machine learning to detect patterns in this noise to improve quantum systems. This is a promising area of progress, and AI could help us make quantum computing more stable and effective. We might very well use AI to improve quantum computers, which could then help us realize the full potential of Quantum for AI in the future.
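As a concrete, deliberately toy illustration of the idea, the sketch below simulates noisy single-qubit readout as two overlapping clouds of points in the I/Q plane and trains an off-the-shelf scikit-learn classifier to tell the two states apart. All the numbers are invented for the example; real labs apply the same pattern-finding idea to actual readout and calibration data.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
labels = rng.integers(0, 2, n)                        # true qubit state for each shot
centres = np.array([[0.0, 0.0], [1.0, 0.6]])          # assumed signal centres for |0> and |1>
iq = centres[labels] + rng.normal(scale=0.45, size=(n, 2))   # add readout noise

clf = LogisticRegression().fit(iq[:1500], labels[:1500])     # learn the decision boundary
print("held-out readout accuracy:", clf.score(iq[1500:], labels[1500:]))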

Quantum and AI in the Same Workflow

Scientists working in quantum computing and AI aren’t always aware of how industrial processes actually work. While it’s essential to focus on hardware and algorithms, it’s also crucial to understand the broader context in which these technologies will be used. Instead of trying to merge quantum and AI into a single system, consider them working in separate but complementary steps of the same process.

For example, Microsoft demonstrated a workflow in 2024 that integrated high-performance computing (HPC), AI, and quantum computing for a chemistry problem. Other companies, such as IonQ and Quantinuum, have also shown similar workflows. The idea isn’t new—IBM had already discussed in 2020 that the future of computation would involve classical bits, quantum qubits, and AI neurons working together.
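The shape of such a workflow is easy to sketch in plain Python. The three functions below are hypothetical placeholders, not any vendor’s API: an AI model screens a large candidate pool, the quantum routine is reserved for the few expensive, high-value calculations, and HPC validates the results classically.

# Schematic only; each stage stands in for a real AI model, quantum program, or HPC job.
def ai_screen(candidates):
    return candidates[:5]                          # AI stage: keep the most promising few

def quantum_refine(molecule):
    return {"molecule": molecule, "energy": -1.0}  # quantum stage: e.g. a ground-state energy estimate

def hpc_validate(result):
    return result["energy"] < 0                    # HPC stage: confirm with a full classical simulation

candidates = [f"molecule_{i}" for i in range(1000)]
shortlist = ai_screen(candidates)                  # thousands in, a handful out
refined = [quantum_refine(m) for m in shortlist]   # quantum calls only where they matter
confirmed = [r for r in refined if hpc_validate(r)]
print(len(confirmed), "candidates survive the full pipeline")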

What Are the Timeframes?

AI for Quantum has been valuable for years and will continue to be useful as we develop quantum computing systems. Understanding how HPC, quantum, and AI can work together is becoming clearer, and we should see practical results by the end of this decade.

However, Quantum for AI is still in the early stages. Currently, it’s mainly about demonstrating that quantum systems can perform better than classical AI on small problems. I believe we’ll need large-scale, error-corrected quantum computers before this becomes a significant reality, and this will likely take another 7 to 10 years, extending into the 2030s.

As we continue to evolve quantum computing, AI will also progress. The approaches we use today might not dominate either field a decade from now, and the integration of both will need to adapt to these ongoing changes.
