Engineers at the University of California San Diego have developed a high-throughput computational method to design new materials for next-generation solar cells and LEDs. Their approach generated 13 new material candidates for solar cells and 23 new candidates for LEDs. Calculations predicted that these materials, called hybrid halide semiconductors, would be stable and exhibit…
Quantum Cloud Computing With Self-check
Many scientists are currently investigating how quantum advantage can be exploited on hardware already available today. Three years ago, physicists at the University of Innsbruck first simulated the spontaneous formation of a pair of elementary particles with a digital quantum computer. Due to the error rate, however, more complex simulations would require a…
Physicists Create Prototype Superefficient Memory for Future Computers
Researchers from the Moscow Institute of Physics and Technology and their colleagues from Germany and the Netherlands have achieved material magnetization switching on the shortest timescales, at a minimal energy cost. They have thus developed a prototype of energy-efficient data storage devices. The paper was published in the journal Nature. The rapid development of information…
Research Group Uses Supercomputing to Target the Most Promising Drug Candidates From a Daunting Number of Possibilities
Identifying the optimal drug treatment is like hitting a moving target. To stop disease, small-molecule drugs bind tightly to an important protein, blocking its effects in the body. Even approved drugs don’t usually work in all patients. And over time, infectious agents or cancer cells can mutate, rendering a once-effective drug useless. A core physical…
Argonne Scientists Further Physics With Supercomputing, Prepare for Delivery of First U.S. Exascale System
James Proudfoot regularly shares his technical expertise for significant global scientific endeavors. To date, his efforts have supported many projects at Fermilab, at his current home base at Argonne National Laboratory, and at CERN’s Large Hadron Collider (LHC). At CERN in 2012, Proudfoot assisted the team in their successful pursuit of the Higgs boson. Today, he…
Quantum World-First: Researchers Reveal Accuracy of Two-Qubit Calculations in Silicon
For the first time ever, researchers have measured the fidelity—that is, the accuracy—of two-qubit logic operations in silicon, with highly promising results that will enable scaling up to a full-scale quantum processor. The research, carried out by Professor Andrew Dzurak’s team in UNSW Engineering, was published today in the world-renowned journal Nature. The experiments were performed…
Researchers Train a Neural Network to Study Dark Matter
As cosmologists and astrophysicists delve deeper into the darkest recesses of the universe, their need for increasingly powerful observational and computational tools has expanded exponentially. From facilities such as the Dark Energy Spectroscopic Instrument to supercomputers like Lawrence Berkeley National Laboratory’s Cori system at the National Energy Research Scientific Computing Center (NERSC), they are on…
Generating High-quality Single Photons for Quantum Computing
MIT researchers have designed a way to generate, at room temperature, more single photons for carrying quantum information. The design, they say, holds promise for the development of practical quantum computers. Quantum emitters generate photons that can be detected one at a time. Consumer quantum computers and devices could potentially leverage certain properties of those photons as…
NCSA Scientist Employs Supercomputer Simulations in Ohio Gerrymandering Case
Computing Faster With Quasi-particles
Majorana particles are very peculiar members of the family of elementary particles. First predicted in 1937 by the Italian physicist Ettore Majorana, these particles belong to the group of so-called fermions, a group that also includes electrons, neutrons and protons. Majorana fermions are electrically neutral and also their own anti-particles. These exotic particles can, for…
Researchers Take a Step Toward Light-based, Brain-like Computing Chip
A technology that functions like a brain? In these times of artificial intelligence, this no longer seems so far-fetched—for example, when a mobile phone can recognize faces or languages. With more complex applications, however, computers still quickly come up against their own limitations. One of the reasons for this is that a computer traditionally has…
New Research Will Serve ORNL’s Mission in Computing, Materials R&D
Energy Secretary Rick Perry, Congressman Chuck Fleischmann and lab officials this week broke ground on a multipurpose research facility that will provide state-of-the-art laboratory space for expanding scientific activities at the Department of Energy’s Oak Ridge National Laboratory. The new Translational Research Capability, or TRC, will be purpose-built for world-leading research in computing and materials…
Shaping the Future of Finance With HPC
NCSA Faculty Fellow Mao Ye, an Assistant Professor at the Gies College of Business at the University of Illinois, conducts research at the intersection of big data, high-performance computing, and economics and finance. Using computing resources, Ye tackles the large amounts of data currently being collected by companies and financial institutions. “The high-performance computing is more like…
Berkeley Lab Highlights ‘the Little Computer Cluster That Could’
Decades before “big data” and “the cloud” were a part of our everyday lives and conversations, a custom computer cluster based at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) enabled physicists around the world to remotely and simultaneously analyze and visualize data. The Parallel Distributed Systems Facility (PDSF) cluster, which had served…
Scientists Develop Software to Balance Data Processing Load in Supercomputers
XSEDE Supercomputer Simulations Help Combat Tuberculosis Granulomas
Physicists Set a New Record of Quantum Memory Efficiency
Like memory in conventional computers, quantum memory components are essential for quantum computers—a new generation of data processors that exploit quantum mechanics and can overcome the limitations of classical computers. With their potent computational power, quantum computers may push the boundaries of fundamental science to create new drugs, explain cosmological mysteries or enhance the accuracy of forecasts…
Developing a Model Critical in Creating Better Devices
Water is everywhere. Understanding how it behaves at an interface with another material, and how it affects the performance of that material, is helpful when trying to develop better products and devices. An undergraduate researcher at Virginia Tech is leading the way. Chemical engineering junior Preeya Achari has now developed, and recently published as first author, a…
New Robust Device May Scale up Quantum Tech, Researchers Say
Researchers have been trying for many years to build a quantum computer that industry could scale up, but the building blocks of quantum computing, qubits, still aren’t robust enough to handle the noisy environment of a working quantum computer. A theory developed only two years ago proposed a way to make qubits more…
Determining the Importance of Connections in Unstructured Data
Discovering information in unstructured data is a major research topic today because it underlies everything from web-search engines to finding recommendations for movies and restaurants. At the University of Texas at Austin, Dr. Keshav Pingali, the W.A. “Tex” Moncrief Chair of Grid and Distributed Computing, leads a team of scientists who use supercomputers at the…
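The connection-importance analysis described above is typically cast as a graph problem, where a node’s importance depends on how strongly other nodes link to it. As a rough illustration only (the link graph, damping factor, and function names below are hypothetical and not taken from the UT Austin codes), a PageRank-style iteration in Python might look like this:

```python
# Minimal PageRank-style ranking over a small directed graph
# (illustrative data; not the article's actual software or dataset).
def pagerank(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for node, out_links in graph.items():
            if not out_links:
                # Dangling node: spread its rank evenly across all nodes.
                for n in nodes:
                    new_rank[n] += damping * rank[node] / len(nodes)
            else:
                share = damping * rank[node] / len(out_links)
                for target in out_links:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny example web graph: page -> pages it links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Real unstructured-data workloads run iterations like this over graphs with billions of edges, which is where the supercomputers come in.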
Artificial Intelligence and Deep Learning Accelerate Efforts to Develop Clean, Virtually Limitless Fusion Energy
The Science: On Earth, the most widely used devices for capturing the clean and virtually limitless fusion energy that powers the sun and stars must avoid disruptions. These devices are bagel-shaped tokamaks. Massive disruptions can halt fusion reactions and potentially damage the fusion reactors. By applying deep learning—a powerful version of the machine learning form…
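As a rough sketch of how disruption prediction can be framed as a supervised-learning problem, the example below uses entirely synthetic features and plain logistic regression in place of the deep networks the researchers train on real tokamak diagnostic data; every name, value, and labeling rule in it is hypothetical.

```python
# Toy disruption-prediction classifier on synthetic "diagnostic" features;
# illustrative only, not the deep learning code referenced in the article.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for plasma diagnostics (e.g. normalized signals);
# label 1 means a disruption followed, 0 means it did not (made-up rule).
n = 1000
X = rng.normal(size=(n, 2))
y = (X @ np.array([1.5, 2.0]) + rng.normal(scale=0.5, size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain logistic regression trained by gradient descent on the log-loss.
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)           # predicted disruption probability
    w -= lr * (X.T @ (p - y)) / n    # gradient step on the weights
    b -= lr * np.mean(p - y)         # gradient step on the bias

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

The published work uses deep networks trained on experimental tokamak data rather than this toy classifier, but the basic framing (signals in, disruption risk out) is the same.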
Scientists Create First Billion-atom Biomolecular Simulation
Researchers at Los Alamos National Laboratory have created the largest simulation to date of an entire gene of DNA, a feat that required one billion atoms to model and will help researchers to better understand and develop cures for diseases like cancer. “It is important to understand DNA at this level of detail because we…
Drones, Supercomputers and Sonar Deployed Against Floods
An arsenal of new technology is being put to the test fighting floods this year as rivers inundate towns and farm fields across the central United States. Drones, supercomputers and sonar that scans deep under water are helping to maintain flood control projects and predict just where rivers will roar out of their banks. Together,…
World-record Quantum Computing Result for Sydney Teams
A world-record result in reducing errors in semiconductor “spin qubits,” a type of building block for quantum computers, has been achieved using the theoretical work of quantum physicists at the University of Sydney Nano Institute and School of Physics. The experimental result by University of New South Wales engineers demonstrated error rates as low as 0.043…
Using Supercomputers to Identify Synthesizable Photocatalysts for Greenhouse CO2 Gas Reduction
Testing nearly 69,000 materials for specific properties was the challenge faced by scientists conducting extensive research on using photocatalytic conversion to reduce the greenhouse gas CO2. The main goal is to find a way to reduce CO2 into chemicals that can provide a source of clean, low-cost renewable energy. But researchers have located very few…
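High-throughput screening of this kind generally amounts to computing a handful of properties for each candidate and keeping only the materials that fall inside target windows. A minimal sketch follows, with entirely hypothetical candidate names, property values, and thresholds; the study’s actual descriptors and criteria are not reproduced here.

```python
# Toy property-based screen for photocatalyst candidates
# (hypothetical names, values, and cutoffs; illustrative only).
candidates = [
    # (name, band gap in eV, energy above hull in eV/atom)
    ("material_A", 2.1, 0.00),
    ("material_B", 0.4, 0.02),
    ("material_C", 1.8, 0.15),
    ("material_D", 2.6, 0.01),
]

def passes_screen(band_gap, e_above_hull,
                  gap_window=(1.7, 3.0), stability_cutoff=0.05):
    """Keep materials whose band gap suits visible-light photocatalysis and
    whose energy above the convex hull suggests they could be synthesized."""
    return gap_window[0] <= band_gap <= gap_window[1] and e_above_hull <= stability_cutoff

shortlist = [name for name, gap, hull in candidates if passes_screen(gap, hull)]
print(shortlist)  # ['material_A', 'material_D']
```

Applied to tens of thousands of computed entries, filters like this narrow the field to the small set of candidates worth detailed follow-up calculations.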