
Big Blue is making a bold claim. “We feel at IBM, we’ve cracked the code to quantum error correction, and it’s our plan to build the first large-scale fault-tolerant quantum computer, which we call IBM Quantum Starling, in 2029,” Jay Gambetta, vice president of IBM Quantum, announced at a recent press conference.
From trial and error to quantum precision
Why does this matter for R&D? For decades, scientists across multiple fields have faced the same bottleneck: knowing what to build without knowing how to find it. In drug discovery, scientists can identify the proteins to target but must rely on trial and error to find a molecule that binds perfectly. In materials science, physicists have theorized room-temperature superconductors since the 1960s, but determining which combination of elements would actually work has proven elusive.
This is the chasm Starling, slated for 2029, aims to cross. Its architecture is designed to tackle some of science’s most stubborn challenges by natively representing the quantum states of molecules. The sheer scale is staggering. “Starling will be capable of running 100 million qubit operations using 200 logical qubits,” explained Matthias Steffen, head of quantum processor technology. “To put that number in perspective, with 200 logical qubits… you need about one quindecillion of the world’s most powerful supercomputers. That’s 10 to the 48 or a one followed by 48 zeros.”
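Steffen’s comparison is ultimately a statement about classical simulation cost. As a rough sanity check, a figure on the order of 10^48 does fall out of the 2^200 amplitudes in a 200-qubit state vector, if one assumes a per-machine capacity; the capacity below is an illustrative assumption, not a number from the announcement:

```python
# Back-of-the-envelope check of the "one quindecillion supercomputers" figure.
# Assumption (not from the announcement): one top supercomputer can hold on
# the order of 10^12 complex amplitudes of a quantum state vector.
amplitudes = 2 ** 200          # entries in the state vector of 200 qubits
per_machine = 10 ** 12         # assumed capacity of a single supercomputer
machines = amplitudes // per_machine

print(f"{machines:.3e} machines")  # on the order of 10^48
```

Under that assumption, the arithmetic lands in the same ballpark as the quoted quindecillion.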
The path follows a decision to abandon the field’s dominant approach. For years, the “surface code” error-correcting approach has been the go-to for protecting quantum information. But IBM concluded it was a dead end for building at scale. “This code, when we’ve looked at it, we came to the conclusion that we were unable to engineer or yield a large-scale system with it,” Gambetta said. Later in the press conference, he called it an “engineering pipe dream” because of the extreme manufacturing precision required.
Instead, IBM pursued a more efficient family of codes known as qLDPC (quantum low-density parity-check), detailed in a breakthrough 2024 Nature paper. The new method delivers considerable resource savings. “In it, it showed a 90% reduction in physical qubit count to perform error correction,” Steffen noted. “This is really an amazing result.” In concrete terms, protecting 12 logical qubits with the traditional surface code would demand nearly 3,000 physical qubits; with IBM’s qLDPC approach, the same task requires just 288.
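The quoted numbers are easy to cross-check. Taking the article’s figures of nearly 3,000 physical qubits under the surface code versus 288 under qLDPC, both protecting 12 logical qubits, the per-logical-qubit overhead and the roughly 90% reduction fall out directly (the round 3,000 stands in for the article’s “nearly 3,000”):

```python
# Overhead comparison using the figures quoted in the article.
surface_physical = 3000  # "nearly 3,000" physical qubits (surface code)
qldpc_physical = 288     # physical qubits with IBM's qLDPC approach
logical_qubits = 12      # logical qubits protected in both cases

reduction = 1 - qldpc_physical / surface_physical
print(f"physical per logical: {surface_physical // logical_qubits} vs "
      f"{qldpc_physical // logical_qubits}")   # 250 vs 24
print(f"reduction: {reduction:.0%}")           # ~90%, matching Steffen's figure
```

So the surface code needs roughly 250 physical qubits per logical qubit here, against 24 for qLDPC, consistent with the 90% reduction Steffen cites.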
Two papers, a singular vision
But that paper pointed to more of a journey than a destination. “There was a lot of work that still needed to be done,” Gambetta continued. “And what we’ve done now is outline two more papers: one paper that goes in and shows how computation can be done with this, and the second going in to show how real-time decoding can be done with that.”
These two papers, published just days before the announcement, complete the blueprint. The first, “Tour de gross,” details the modular architecture, while the second, “Improved belief propagation,” solves the real-time error correction challenge. It’s a comprehensive plan that Steffen claims is unique in the field. “As of today, no one has demonstrated a credible path to simultaneously demonstrate all of these criteria, nor has anyone shown a credible plan to do so,” he said.
The pre-print papers address the missing pieces needed to turn the efficient qLDPC codes into a working quantum computer:
- Tour de gross: A modular quantum computer based on bivariate bicycle codes: https://arxiv.org/abs/2506.03094
- Improved belief propagation is sufficient for real-time decoding of quantum memory: https://arxiv.org/abs/2506.01779
Yet while the 2029 timeline represents the full realization of fault-tolerant quantum computing, IBM isn’t asking the industry to wait. “We expect quantum advantage to be demonstrated in our systems even before we reach these fault-tolerant quantum computers,” Gambetta said.
The first paper, from Yoder et al., lays out a roadmap for a quantum computer that is far more capable and resource-efficient for the kind of physics and chemistry simulations that underpin materials science. In other words, it brings practical quantum advantage significantly closer. The second paper, from Müller et al., demonstrates a practical, high-performance decoder that makes the advanced error correction feasible.
While these papers establish the theoretical foundation, IBM and its partners are already demonstrating quantum computing’s practical potential. “Our partners at RIKEN have already started to show that mixing a quantum computer with a supercomputer Fugaku, they’re getting comparable results in chemistry,” Gambetta said. RIKEN and IBM have also incorporated quantum computations into simulations of triple-bond breaking in molecular nitrogen and of iron-sulfur clusters. ExxonMobil teams have collaborated with IBM to model maritime inventory routing on quantum devices for shipping cleaner fuels, and the Cleveland Clinic Foundation has used IBM’s SQD add-on to run chemistry simulations it has since published.
A quantum computer you can use this year
For those who want to begin their quantum journey sooner rather than later, Gambetta pointed to a nearer-term opportunity. “Hidden in the noise of this announcement is another huge announcement, Nighthawk. Nighthawk is a new processor that we’ll be making available to our clients at the end of this year… which will effectively allow our users to run circuits that use 15 times more gates.”
For R&D professionals, Nighthawk may represent an immediate opportunity to begin tackling those seemingly intractable problems, from protein-drug interactions to novel material properties, with quantum circuits that were previously too deep to run on existing quantum hardware.