With just 10,000 qubits, Caltech and Oratomic show the way to functional quantum computers.
Caltech news
Researchers at the California Institute of Technology (Caltech) and the startup Oratomic have developed a new theoretical framework that drastically lowers the resources needed to build a practical, fault-tolerant quantum computer, potentially hastening the arrival of the quantum computing era. For many years, researchers agreed that to overcome the inherent fragility of quantum systems, a working machine capable of solving complex real-world problems would need millions of physical qubits. But according to the new work, a fully functional quantum computer might be built with as few as 10,000 to 20,000 qubits, possibly putting these devices online by the end of this decade.
The Challenge of Quantum Fragility
Quantum computers compute using subatomic particles. Classical computers use bits (1s and 0s), whereas quantum computers use qubits, which can exist in superposition and become entangled across great distances. This lets quantum machines solve certain problems that even the most powerful supercomputers cannot.
These quantum states are extremely fragile. When qubits interact with their surroundings, their quantum states can collapse, producing calculation errors. To detect and fix these errors, scientists employ a technique called quantum error correction, which adds extra, redundant qubits. A single "logical" qubit, the stable unit that actually performs a calculation, typically requires about 1,000 physical qubits under existing standard protocols such as surface codes. A computer would need at least 1,000 logical qubits, a total of roughly 1 million physical qubits, to run a useful algorithm. Scaling a machine to that size remains a huge engineering challenge.
An Innovative Efficiency Architecture
According to a recently released study, the Caltech and Oratomic team's innovation is an "ultra-efficient" error-correction design that drastically reduces this demand. By exploiting the special characteristics of neutral-atom qubits, the scientists discovered they could encode a logical qubit with as few as five physical qubits instead of roughly 1,000, a two-order-of-magnitude decrease in qubit counts. The effectiveness of this approach is actually somewhat unexpected, according to Caltech physics professor Manuel Endres.
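The scale of the reduction can be checked with back-of-the-envelope arithmetic. The sketch below uses the article's two ballpark overhead figures (1,000 physical qubits per logical qubit for surface codes, five for the proposed neutral-atom scheme); the function name and exact ratios are illustrative, not from the study itself.

```python
# Back-of-the-envelope comparison of the two error-correction regimes
# described above. The 1,000:1 (surface code) and 5:1 (proposed
# neutral-atom code) ratios are the article's ballpark figures.

def physical_qubits(logical_qubits: int, physical_per_logical: int) -> int:
    """Total physical qubits needed for a given logical-qubit count."""
    return logical_qubits * physical_per_logical

LOGICAL_NEEDED = 1_000  # logical qubits for a viable algorithm

surface = physical_qubits(LOGICAL_NEEDED, 1_000)   # conventional surface code
efficient = physical_qubits(LOGICAL_NEEDED, 5)     # proposed high-rate code

print(f"Surface code:   {surface:,} physical qubits")    # 1,000,000
print(f"High-rate code: {efficient:,} physical qubits")  # 5,000
print(f"Reduction: {surface // efficient}x")             # 200x
```

A 200-fold reduction is the "two orders of magnitude" the researchers describe, and it brings the raw count into the 10,000-to-20,000-qubit range once additional overhead is included.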
In contrast to other well-known platforms such as trapped ions or superconducting circuits, neutral-atom systems physically arrange and manipulate atoms into arrays using optical tweezers, which are tightly focused laser beams. Unlike previous quantum computing systems, neutral-atom qubits can be directly coupled across great distances, Endres says: one atom can be moved to the opposite end of the array with optical tweezers and immediately entangled with another atom. This ability to rearrange qubits dynamically enables high-rate codes, in which each physical qubit can participate in several logical qubits at once rather than connecting only with its nearest neighbors.
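As a loose classical analogy (not the team's actual quantum code), a parity-check matrix shows how a single bit can contribute to several checks at once; high-rate quantum codes rely on a similar many-to-many structure, which is why long-range connectivity between qubits matters. The [7,4] Hamming code below is a standard textbook example used purely for illustration.

```python
# Classical analogy for a "high-rate" code: the [7,4] Hamming code.
# Each column of the parity-check matrix H is one bit; a 1 in row i
# means that bit participates in parity check i. Bits appearing in
# multiple rows are shared across checks, loosely analogous to
# physical qubits serving several logical qubits in a high-rate code.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

n = len(H[0])   # 7 bits in each codeword
k = n - len(H)  # 4 information bits, so the rate is 4/7
checks_per_bit = [sum(row[j] for row in H) for j in range(n)]

print(f"Rate k/n = {k}/{n}")
print("Checks each bit participates in:", checks_per_bit)  # [1, 1, 2, 1, 2, 2, 3]
```

A nearest-neighbor-only architecture restricts which bits (or qubits) can share a check; movable atoms lift that restriction, allowing denser check structures and hence higher-rate codes.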
Global Security Consequences
Although the scientific community sees this as a victory, the expedited schedule raises concerns about digital security. Encryption techniques like RSA and ECC, which rely on mathematical puzzles that traditional computers are unable to solve, safeguard the majority of global financial transactions and private conversations.
In 1994, Peter Shor created an algorithm demonstrating that these codes could be cracked by a sufficiently powerful quantum computer. The new study suggests the window for converting to "quantum-safe" encryption standards is closing faster than expected: such a computer was previously thought to be 10 to 20 years away. The authors argue that companies must deploy quantum-resistant encryption standards quickly.
Solving Scientific Mysteries
The possible uses of a 10,000-qubit processor go well beyond encryption. Because nature is fundamentally quantum, these computers are particularly well suited to problems in chemistry, health, and sustainability. Researchers say they may lead to advances in machine learning, room-temperature superconductivity, and even basic physics questions such as quantum gravity. "I always assumed theoretical research on large-scale quantum algorithms was only relevant in the far future," says Hsin-Yuan (Robert) Huang, Caltech assistant professor and Oratomic CTO. "According to our latest research, they might materialize in the next few years."
The Road Ahead
The work has already begun to move from theoretical discovery to practical reality. Endres and his colleagues recently assembled the largest qubit array ever made, with 6,100 trapped neutral atoms. The next stage is to scale these arrays even further while keeping error rates low.
The researchers established Oratomic, a start-up led by CEO Dolev Bluvstein, to advance this technology into the commercial and utility sectors. Together with Caltech’s Advanced Quantum Computing Mission, Oratomic will construct the first utility-scale, fault-tolerant quantum computers in history. John Preskill, the Richard P. Feynman Professor of Theoretical Physics at Caltech, says, “I’ve been working on fault-tolerant quantum computing for longer than some of my coauthors have been alive. Now at last we’re getting close.”
“Now it’s time to build the machines,” says Bluvstein, reflecting the team’s sense of urgency.