Research & Development World


Google’s Willow achieves 10 septillion-year quantum speedup, though more breakthroughs in fault-tolerance needed

By Brian Buntz | December 9, 2024

105-qubit “Willow” processor shown in the center. [Google]

Google’s unveiling of the Willow quantum processor could go down as a pivotal moment in the pursuit of fault-tolerant quantum computing. A BBC article called the advance “mind-boggling.” And in many respects, it is. The processor completed a specialized computation in five minutes that would theoretically take today’s fastest supercomputers 10 septillion years—a timespan that vastly exceeds the age of the universe. Perhaps even more significantly, the chip demonstrates a breakthrough in quantum error correction that researchers have pursued for almost three decades, showing for the first time that adding more qubits to the system can actually reduce errors rather than increase them.

The results, reported in a soon-to-be-published paper in Nature, detail progress on a key challenge that has long constrained the field: quantum error correction (QEC) below threshold. This progress, however, arrives accompanied by caveats, complexities, and the understanding that many years and substantial refinements remain before practical quantum devices become a reality.

Central to Google’s claims is the demonstration of a large-scale surface code memory that reduces logical errors as the system’s size increases. The research team reports that their processor achieves a 101-qubit distance-7 surface code exhibiting a logical error rate of roughly 0.143% per cycle—a result that marks a “logical memory” surpassing the performance of its best constituent physical qubits by more than a factor of two. The results are among the first to show at scale the theoretical principles that underpin fault-tolerant quantum computing.

Coherence times achieved on Google’s Willow chip. The distance-7 logical qubit shows a 291±6 µs coherence time—over double that of the best physical qubit measured (119±13 µs) and more than three times the median physical qubit performance (85±7 µs).

As the preprint notes, “The logical error rate of our larger quantum memory is suppressed by a factor of Λ = 2.14 ± 0.02,” confirming that more qubits can lead to fewer logical errors under certain conditions.
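The practical meaning of the suppression factor is that, below threshold, each increase of the surface code distance d by two divides the logical error rate by roughly Λ. A minimal sketch, using only the numbers reported in the article (Λ = 2.14, 0.143% per cycle at d = 7) and the standard exponential-suppression scaling—the function and constants here are illustrative, not Google's code:

```python
# Illustrative sketch: extrapolating per-cycle logical error rates
# from the reported suppression factor. Below threshold, growing the
# code distance d by 2 divides the logical error rate by ~Lambda.

LAMBDA = 2.14    # reported error-suppression factor
EPS_D7 = 0.00143 # reported logical error rate per cycle at distance 7

def projected_error_rate(d, eps_ref=EPS_D7, d_ref=7, lam=LAMBDA):
    """Project the per-cycle logical error rate at odd distance d."""
    return eps_ref / lam ** ((d - d_ref) / 2)

for d in (7, 9, 11, 15, 25):
    print(f"d={d:2d}: ~{projected_error_rate(d):.2e} errors/cycle")
```

The exponential form is why below-threshold operation matters: reaching the <10^-10 error rates needed for large algorithms requires growing the code, which only helps if Λ stays above 1 as qubits are added.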

As is often the case, more research is needed. The processor demonstrates below-threshold performance for a specifically selected set of operations and architectures. As Professor Alan Woodward from Surrey University notes in BBC’s coverage, one must “be careful not to compare apples and oranges.” Woodward goes on to say that Google had chosen a benchmark problem “tailor-made for a quantum computer” rather than demonstrating “a universal speeding up when compared to classical computers.”

Time will tell whether the development represents the “best quantum processor built to date,” as Hartmut Neven, leader of Google’s Quantum AI lab, put it, or whether the announcement is more of a measured step forward.

Beyond the raw numbers, the Willow processor’s system-level improvements shine a light on what it will take to build practical quantum machines. To achieve the reported performance, Google’s team used a distance-7 surface code memory composed of 49 data qubits, 48 measure qubits, and 4 additional leakage removal qubits, stabilizing the system against a variety of error sources. The researchers highlight that detection probabilities increase with code size owing to finite-size effects and parasitic qubit couplings. Even though the logical qubit surpasses its physical constituents, the hardware remains far from the error rates required for extensive fault-tolerant computation. Current state-of-the-art gate fidelities, on the order of 99.9%, still pale in comparison to the <10^-10 error rates envisioned for many advanced quantum algorithms.
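The qubit budget quoted above follows the standard accounting for a distance-d (rotated) surface code patch: d² data qubits and d²−1 measure qubits. A quick sketch of that arithmetic, with the 4 leakage-removal qubits treated as an add-on as the article describes (the helper function is hypothetical, written for illustration):

```python
# Sketch: qubit budget of a distance-d rotated surface code patch.
# A distance-d patch uses d*d data qubits and d*d - 1 measure qubits;
# the article's d=7 memory adds 4 leakage-removal qubits on top.

def surface_code_qubits(d, leakage_removal=0):
    """Return (data, measure, total) qubit counts for odd distance d."""
    data = d * d
    measure = d * d - 1
    return data, measure, data + measure + leakage_removal

data, measure, total = surface_code_qubits(7, leakage_removal=4)
print(data, measure, total)  # 49 48 101
```

The quadratic growth in d is the cost side of the trade-off: each two-step increase in distance buys a factor of Λ in error suppression but demands roughly 2d² more physical qubits of Willow-grade quality.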

Moreover, the study identifies rare but significant error bursts and correlated errors that can cause logical failures roughly once every few billion cycles. Additional challenges include integrating real-time decoding strategies that process error-correction information on millisecond—or even microsecond—timescales. The research team’s paper stresses that “fully fault-tolerant quantum computing requires error rates well below those displayed by Willow.”

