The Quantum Economic Development Consortium has released a 28-page report, “Quantum Computing and Artificial Intelligence Use Cases,” setting out why the two technologies should be developed in tandem and what Washington, universities, and industry can do to speed that convergence. The document distills insights from a Seattle workshop held on October 29, 2024, that pulled in quantum engineers,…
Quantum computing hardware advance slashes superinductor capacitance >60%, cutting substrate loss
Reducing performance-killing noise from chip substrates is key to advancing quantum computing. Addressing this challenge, Lawrence Berkeley National Laboratory scientists developed a practical chemical etching process that precisely lifts vital superconducting components, known as superinductors, just above the wafer surface. This suspension method directly targets stray capacitance and substrate-related loss channels by minimizing physical contact. The research…
Hold your exaflops! Why comparing AI clusters to supercomputers is bananas
Okay, deep breaths. Maybe you’ve heard the buzz around Google’s Ironwood TPUs, which generated at least one headline claiming the system offered a 24x performance boost over the world’s most advanced supercomputer, El Capitan. Or perhaps the news about Nvidia’s Blackwell line of GPUs, its forthcoming exaflop Vera Rubin platform, or xAI’s Colossus cluster, which…
Why IBM predicts quantum advantage within two years
Industry analysts from McKinsey to Omdia largely converge on a timeline that puts initial quantum advantage within the next few years. While the era when quantum computers can routinely tackle large-scale challenges in fields like drug discovery and materials science might still be years away, IBM’s Quantum CTO, Oliver Dial, Ph.D., predicts the threshold of…
Aardvark AI forecasts rival supercomputer simulations while using over 99.9% less compute
A deep learning system known as Aardvark Weather offers accurate weather forecasts that are orders of magnitude quicker to generate than existing systems. Described in a Nature article (currently posted as a preprint), the system can generate predictions on four NVIDIA A100 GPUs that would otherwise take roughly 1,000 node-hours on a traditional supercomputer system…
Quantum industry sees rapid growth in 2025, report finds
According to the Quantum Economic Development Consortium’s (QED-C) “2025 State of the Global Quantum Industry” report, the global quantum technology industry is experiencing unprecedented growth and investment. The report reveals a rapidly expanding market driven by advancements in quantum computing, sensing, and communication technologies, fueled by significant public and private funding. The following is a…
Mission accomplished: NVIDIA CEO’s quantum mea culpa brings Microsoft and AWS to GTC table after market-rattling comments
NVIDIA CEO Jensen Huang hosted the company’s first-ever “Quantum Day” at its GTC conference on Thursday, bringing together executives from across the quantum computing industry just two months after dismissing the technology as something that won’t be useful for “15 to 30 years.” In the immediate aftermath of Huang’s skepticism, shares of IonQ, Rigetti Computing,…
Quantinuum joins NVIDIA’s Accelerated Quantum Research Center as founding collaborator
Quantinuum has been chosen as a founding collaborator in the upcoming NVIDIA Accelerated Quantum Research Center (NVAQC), an initiative to advance hybrid quantum-classical computing. Set to open later this year, the center will integrate Quantinuum’s System Model H2, one of the highest-performing quantum systems, with NVIDIA’s CUDA-Q platform and the GB200 NVL72 supercomputer to enhance…
Nanodots enable fine-tuned light emission for sharper displays and faster quantum devices
Penn State and Université Paris-Saclay researchers report a new way to control light by embedding “nanodots” in ultra-thin, two-dimensional (2D) materials. The team says this precision could lead to higher-resolution screens and advances in quantum computing technologies. In a study published in ACS Photonics, the scientists demonstrated how these nanodots — tiny islands of a…
Alice & Bob reports 160-fold improvement in cat qubit error protection
The quantum computing company Alice & Bob has announced a new method for stabilizing its cat qubits. Cat qubits encode quantum information in superpositions of coherent states, mimicking the macroscopic superposition of Schrödinger’s cat. The company says this method can achieve up to 160 times better bit-flip error protection. The approach involves “squeezing” cat qubits…
Quantum Brilliance, Pawsey integrate room-temp quantum with HPC on NVIDIA GH200
Imagine no longer needing to stand next to a giant supercomputer to dive into quantum research. Thanks to Quantum Brilliance’s virtual Quantum Processing Unit (vQPU), you can now explore quantum computing applications from wherever you are — whether that’s a standard workstation, a remote HPC cluster, or the cloud. This advancement emulates the experience of…
Frontier supercomputer reveals new detail in nuclear structure
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory has unveiled a new technique to predict nuclear properties with unprecedented precision. By harnessing the Frontier supercomputer, the world’s first exascale system, the scientists modeled how subatomic particles bind and shape an atomic nucleus — work that could open new frontiers in…
Microsoft’s Majorana 1 is ‘world’s first quantum processor powered by topological qubits’
Microsoft has unveiled Majorana 1, a new quantum chip built on what the company calls its Topological Core architecture. Engineers say the design could lead to quantum machines with up to one million qubits, a size necessary to tackle complex problems in fields like chemistry, manufacturing, and environmental sustainability. Moving beyond conventional qubits At the…
RIKEN partners with Quantinuum to develop quantum-supercomputing hybrid platform
RIKEN, Japan’s largest comprehensive research institution, has selected Quantinuum’s H1-Series ion-trap quantum computing technology for its new quantum-supercomputing hybrid platform. The collaboration will see Quantinuum install its hardware at RIKEN’s campus in Wako, Saitama, as part of a project to integrate quantum computers with high-performance computing (HPC) systems like the supercomputer Fugaku. The initiative, commissioned…
Students use machine learning to predict crime at Thunderbird Hackathon
High school students dove into the world of coding and artificial intelligence (AI) at the second annual Thunderbird Hackathon, held earlier this month. Sponsored by Sandia National Laboratories and Explora’s X Studio, the event challenged teams to create machine learning models predicting crime incidents using real data from Albuquerque’s open-data initiative. “At Thunderbird Hacks, we…
AI takes center stage at ORNL, where potential meets risk
In the early 1990s, the internet seemed poised to improve our lives by democratizing knowledge, publishing, and communication. While it did achieve many of these goals, it also introduced security risks ranging from malware to phishing. The online world of 2024 feels more like a war zone than a digital playground. “If you connect a…
This week in AI research: Latest Insilico Medicine drug enters the clinic, a $0.55/M token model R1 rivals OpenAI’s $60 flagship, and more
While OpenAI charges $60 per million tokens for its flagship reasoning model, a Chinese startup just open-sourced an alternative that matches its performance—at 95% less cost. Meet DeepSeek-R1, the RL-trained model that’s not just competing with Silicon Valley’s AI giants but, in some configurations, running on consumer laptops rather than in data…
How the startup ALAFIA Supercomputers is deploying on-prem AI for medical research and clinical care
Imagine a hospital spending millions on advanced imaging equipment yet relying on decades-old computers to run the software. That paradox propelled robotics and computer vision veteran Camilo Buscaron—a former systems engineer at NVIDIA and Chief Technologist for AWS Robotics—into action. In 2023, he set out to commercialize an open-source computer vision library known as Kornia,…
R&D Market Pulse: $29B energy mega-merger, new CHIPS Act hub at ASU, and more AI restrictions on China
In this week’s R&D Market Pulse, the $29.1 billion Constellation-Calpine mega-merger promises to reshape U.S. energy, the Commerce Department awards a third CHIPS for America facility to Arizona State University, and new AI export restrictions put China on notice. Meanwhile, Elon Musk’s xAI rolls out a consumer app, BlackRock withdraws from a major climate initiative,…
Sandia Labs joins with other institutions to tackle AI energy challenges with microelectronics research
Sandia National Laboratories has partnered with leading research institutions to tackle a potential energy crisis driven by the increasing energy demands of artificial intelligence (AI) and other advanced technologies. Jeffrey Nelson, a principal investigator at Sandia, highlighted the issue’s urgency: “Computing alone is projected to consume a significant portion of the total planetary energy production…
A glimpse of the world’s top 10 most powerful supercomputers
According to the November 2024 TOP500 rankings, Lawrence Livermore’s El Capitan is the most powerful supercomputer, with Frontier and Aurora rounding out the top three slots. These systems range from newly installed exascale powerhouses to long-established machines that are continuously evolving. All performance results come from the Linpack…
An overview of the late 2024 supercomputing landscape in 6 charts
Over the past couple of years, the world’s most powerful supercomputers have experienced a sizable leap in performance. The combined processing power of the computers on the TOP500 list surged from 5.24 exaflops in June 2023 to 11.72 exaflops in November 2024, representing a 123.7% increase. Meanwhile, the anticipated Colossus supercomputer from Elon Musk’s xAI—if…
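The percentage growth cited above follows directly from the two TOP500 aggregate figures; a quick sketch of the arithmetic (using only the numbers stated in the article):

```python
# Aggregate TOP500 performance figures cited in the article.
june_2023_exaflops = 5.24
nov_2024_exaflops = 11.72

# Relative increase between the two lists.
pct_increase = (nov_2024_exaflops / june_2023_exaflops - 1) * 100
print(f"{pct_increase:.1f}% increase")  # prints "123.7% increase"
```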
R&D 100 winner of the day: Precision Photon Synchronization System
MIT Lincoln Laboratory’s Precision Photon Synchronization System represents a significant step forward in quantum networking. It offers a practical and efficient solution for enabling quantum communication across vast distances. By ensuring that entangled photons can arrive at their destinations with extreme precision, the system helps overcome one of the central challenges in establishing global quantum…
LLNL’s computing chief on balancing AI innovation with sustainability
The AI boom is sparking a potential energy crisis. Data centers, housing the powerful GPU-enabled servers that fuel AI’s growth, are projected to consume 12% of US electricity by 2028, as Reuters recently noted. Prominent tech firms like xAI, Meta, Microsoft, and OpenAI are pouring billions into GPU-based “mega-clusters” involving 100,000 or more GPUs. In addition,…
New York Stock Exchange features IonQ Technology in quantum computing first
The New York Stock Exchange (NYSE) is showcasing the ion trap chip developed by IonQ, a quantum computing and networking firm. This is the first time a quantum computing company has been featured at the exchange. The ion trap, a foundational component of IonQ’s quantum computers, is on display in the NYSE lobby, emphasizing the…