There’s a lot of buzz in the quantum computing world about running AI on quantum hardware, often referred to as QuantumAI. While the potential seems huge, there are also plenty of challenges, especially when it comes to the massive amounts of data needed to train AI models.
Quantinuum, a company specializing in trapped ion quantum computing, recently shared a blog titled “Quantum Computers Will Make AI Better.” The blog reviews the progress made in adapting AI techniques for quantum computing and suggests that even bigger changes are on the horizon.
Training models like ChatGPT means fitting billions or even trillions of parameters on enormous datasets, demanding immense computational power that is usually spread across thousands of GPUs or specialized hardware accelerators. The environmental impact is substantial: training GPT-3, for example, consumed almost 1,300 megawatt-hours of electricity, roughly the annual energy usage of 130 average U.S. homes. Despite these costs, the push to develop ever-larger models shows no signs of slowing down.
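The home-equivalence figure is easy to sanity-check. The per-home consumption used below (~10 MWh per year) is an assumption, close to commonly cited U.S. residential averages, not a number from the blog:

```python
# Back-of-envelope check of the GPT-3 energy comparison.
# ~10 MWh/year per average U.S. home is an assumed round figure.
gpt3_training_mwh = 1_300
mwh_per_home_per_year = 10
home_years = gpt3_training_mwh / mwh_per_home_per_year
print(home_years)  # 130.0
```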
This is where quantum computing comes in. Quantum technology offers a more sustainable, efficient, and high-performance solution that could fundamentally change AI. It promises to dramatically lower costs, increase scalability, and overcome the limitations of today’s classical systems.
While the blog highlights Quantinuum’s expertise in AI, especially in adapting AI techniques for quantum hardware, it also provides a great overview for those unfamiliar with how AI tools are being adjusted to run on quantum computers.
The blog’s main focus is on converting natural language processing (NLP) techniques for use on quantum hardware. Quantinuum has made significant progress in this area, especially with its work on transforming recurrent neural networks (RNNs) into parameterized quantum circuits (PQCs).
In a recent experiment, the team used a quantum RNN to perform a standard NLP task: classifying movie reviews from Rotten Tomatoes as either positive or negative. The results were impressive: the quantum RNN performed on par with classical RNNs, GRUs, and LSTMs while using only four qubits. This is noteworthy for two reasons: it shows that quantum models can achieve competitive performance with a much smaller vector space, and it suggests that quantum computing could yield significant energy savings in AI development.
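The blog does not publish the circuit Quantinuum used, but the general shape of a parameterized-quantum-circuit classifier can be sketched with a small statevector simulation in plain NumPy. Everything here — the angle encoding, the layer structure, and the single-qubit readout — is an illustrative assumption, not Quantinuum's actual model:

```python
import numpy as np

N_QUBITS = 4                     # matching the four qubits cited in the blog
DIM = 2 ** N_QUBITS

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
    psi = state.reshape([2] * N_QUBITS)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(DIM)

def apply_cnot(state, ctrl, targ):
    """Apply CNOT by flipping the target axis where the control is |1>."""
    psi = state.reshape([2] * N_QUBITS).copy()
    sel = [slice(None)] * N_QUBITS
    sel[ctrl] = 1
    axis = targ - 1 if targ > ctrl else targ   # axis index after fixing ctrl
    psi[tuple(sel)] = np.flip(psi[tuple(sel)], axis=axis)
    return psi.reshape(DIM)

def pqc_score(features, params):
    """Angle-encode 4 features, run variational layers, read out <Z> on qubit 0."""
    state = np.zeros(DIM)
    state[0] = 1.0
    for q in range(N_QUBITS):                  # data encoding
        state = apply_1q(state, ry(features[q]), q)
    for layer in params:                       # trainable layers
        for q in range(N_QUBITS):
            state = apply_1q(state, ry(layer[q]), q)
        for q in range(N_QUBITS - 1):          # linear entangling chain
            state = apply_cnot(state, q, q + 1)
    probs = (np.abs(state) ** 2).reshape([2] * N_QUBITS)
    p0 = probs[0].sum()                        # probability qubit 0 is |0>
    return 2.0 * p0 - 1.0                      # <Z> in [-1, 1]; sign gives the class
```

In this toy setup, a review embedding reduced to four features would be passed as `features`, and `params` (shape `(layers, 4)`) would be trained so that positive reviews push the readout above zero.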
In another experiment, Quantinuum collaborated with Amgen to use PQCs for peptide classification, a common task in computational biology. Using the Quantinuum System Model H1, the team performed sequence classification, which is used in the design of therapeutic proteins. The results were competitive with classical methods of a similar scale. This was the first proof-of-concept application of near-term quantum computing to a task critical to therapeutic protein design, helping pave the way for larger-scale applications in this field in step with Quantinuum's hardware roadmap.
The blog also briefly touches on work with transformers and tensor networks, both active areas of AI research.
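The blog does not go into detail there, but one reason tensor networks interest AI researchers is parameter compression of large weight tensors. The simplest instance of the idea — splitting a dense layer into two small cores via truncated SVD, with matrix sizes and bond dimension chosen here purely for illustration — looks like this:

```python
import numpy as np

# Compress a dense 64x64 weight matrix into two small "cores" joined by
# a bond of dimension 8 -- the simplest tensor-network (tensor-train) split.
# Sizes and bond dimension are illustrative choices, not from the blog.
rng = np.random.default_rng(1)
W = rng.normal(size=(64, 64))

U, s, Vt = np.linalg.svd(W, full_matrices=False)
bond = 8
A = U[:, :bond] * s[:bond]        # left core, shape (64, 8)
B = Vt[:bond, :]                  # right core, shape (8, 64)
W_approx = A @ B                  # low-rank reconstruction of W

dense_params = W.size             # 4096
tn_params = A.size + B.size       # 1024: a 4x parameter reduction
```

Deeper tensor-train factorizations repeat this split along every mode, which is what makes the approach attractive for compressing very large models.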
The discussion on how to combine AI and quantum computing is just beginning to take shape. While the quantum computing community is eager to avoid being left behind by the AI revolution, figuring out how to integrate the two will take time. However, like many in the quantum development community, Quantinuum believes QuantumAI will be a major and profitable breakthrough.
In conclusion, the blog points out that as quantum computing hardware continues to improve, quantum AI models may increasingly complement or even replace classical systems. By using quantum superposition, entanglement, and interference, these models could drastically reduce computational costs and energy consumption. With fewer parameters needed, quantum models could make AI more sustainable, tackling one of the industry’s biggest challenges.