A cross-institutional partnership led by Quantum Elements Inc. and Amazon Web Services (AWS) has announced a breakthrough in simulating quantum error correction (QEC), a significant step toward fault-tolerant quantum computing. In 2026, the group unveiled a new “digital twin” platform that can execute hardware-faithful simulations of a 97-qubit system in around an hour, a feat previously thought to be computationally intractable for classical hardware.
Experts from Harvard University and the University of Southern California (USC) participated in the study, which tackles a major obstacle in the quantum industry: the discrepancy between theoretical error models and the messy, “noisy” reality of real quantum hardware.
The Challenge of Realistic Simulation
Quantum error correction (QEC) is necessary for quantum computers to carry out practical tasks. To shield data from outside noise, a single “logical” qubit is encoded into several “physical” qubits. Determining precisely how many physical qubits are required, and how good the hardware must be, to produce a stable logical qubit remains a difficult engineering problem.
Answering these questions requires simulations that capture coherent and correlated noise: the complex error mechanisms, such as crosstalk and phase-sensitive drifts, that arise in real devices. Researchers have traditionally used “Clifford” simulators, such as Stim, which are fast but frequently must simplify noise into stochastic Pauli flips (a process known as Pauli twirling). These simplifications often overlook the very factors that degrade performance in actual experiments.
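To see what Pauli twirling discards, consider a minimal sketch (my own illustration, not the study's code): twirling a coherent single-qubit over-rotation turns it into a stochastic bit-flip channel with probability sin²(θ/2), erasing the phase of the rotation entirely.

```python
import numpy as np

# Single-qubit Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_twirl_probs(U):
    """Pauli-twirl a single-qubit unitary error U into a stochastic
    Pauli channel with probabilities p_P = |Tr(P @ U)|^2 / 4."""
    return {name: abs(np.trace(P @ U)) ** 2 / 4
            for name, P in [("I", I), ("X", X), ("Y", Y), ("Z", Z)]}

theta = 0.02  # small coherent over-rotation about X (radians)
U = np.cos(theta / 2) * I - 1j * np.sin(theta / 2) * X

probs = pauli_twirl_probs(U)
# Twirling keeps only the flip probability sin^2(theta/2); the sign and
# phase of the coherent rotation, which can add up across gates, are lost.
print(probs)
```

This loss of coherence information is exactly why twirled models can miss systematic, direction-dependent error buildup on real hardware.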
An exact “open-system” simulation using a quantum master equation is the alternative, but it is far too demanding: for a 97-qubit system, the density matrix holds 4⁹⁷ entries, more than even the world’s biggest supercomputers can track.
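A quick back-of-the-envelope calculation makes this “exponential wall” concrete: an n-qubit density matrix is 2ⁿ × 2ⁿ, i.e. 4ⁿ complex entries, at 16 bytes each for double precision.

```python
def density_matrix_bytes(n_qubits):
    # rho is 2^n x 2^n, i.e. 4^n complex entries, 16 bytes each
    # at double precision.
    return (4 ** n_qubits) * 16

for n in (10, 30, 97):
    print(f"{n:3d} qubits: {density_matrix_bytes(n):.3e} bytes")
```

For 97 qubits this works out to roughly 10⁵⁹ bytes, dozens of orders of magnitude beyond the exabyte-scale storage of today's largest machines.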
Scaling the “Digital Twin” via Quantum Monte Carlo
The team overcame this “exponential wall” with a real-time Quantum Monte Carlo (QMC) technique developed at USC by physicists Tong Shen and Daniel Lidar. By stochastically compressing and updating the density matrix with a “population of walkers,” the approach trades deterministic certainty for controllable statistical error bars while preserving the accuracy of the noise features.
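The USC method's details are beyond this article, but the core trade is the standard Monte Carlo one, which a toy example can illustrate (my sketch, not the team's algorithm): average many cheap stochastic “walker” trajectories of a single qubit undergoing T1 decay, and the ensemble mean recovers the deterministic master-equation answer to within sampling noise that shrinks as 1/√N.

```python
import numpy as np

rng = np.random.default_rng(0)

T1 = 200e-6        # relaxation time (s), within the 150-300 us range quoted
dt = 1e-6          # time step (s)
steps = 200
n_walkers = 20000  # statistical error shrinks as 1/sqrt(n_walkers)

# Each walker is a stochastic trajectory: True = excited, False = decayed.
excited = np.ones(n_walkers, dtype=bool)
populations = []
for _ in range(steps):
    decays = rng.random(n_walkers) < dt / T1  # decay prob. per step
    excited &= ~decays
    populations.append(excited.mean())

# The walker average reproduces the deterministic solution exp(-t/T1)
# up to statistical fluctuations.
t = dt * np.arange(1, steps + 1)
max_err = np.max(np.abs(np.array(populations) - np.exp(-t / T1)))
print(f"max deviation from exact decay: {max_err:.4f}")
```

Because walkers are independent, the averaging is embarrassingly parallel, which is what makes the approach a natural fit for large HPC clusters.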
The team implemented this strategy on AWS ParallelCluster-managed Amazon EC2 Hpc7a instances. By exploiting the parallel nature of QMC updates, they simulated a single syndrome-extraction round of a distance-7 rotated surface code, which comprises 49 data qubits and 48 measurement qubits, in roughly 75 minutes on a single compute node.
Experimental Benchmarks and Findings
The simulation focused on a hardware-motivated transmon noise model. Key parameters included:
- Decoherence: T1 and T2 times between 150 and 300 μs.
- Crosstalk: 20–100 kHz residual ZZ crosstalk between coupled qubits.
- Control Errors: 25 ns Gaussian single-qubit pulses and 50 ns sigmoid two-qubit pulses, with a 0.1% under-rotation to mimic miscalibration.
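Collected in one place, the published parameters might look like the following configuration sketch. The field names are hypothetical, not the team's actual schema; only the numbers come from the announcement.

```python
# Hypothetical configuration collecting the published noise-model numbers;
# field names are illustrative, not the team's actual schema.
transmon_noise_model = {
    "decoherence": {
        "T1_us": (150, 300),   # relaxation time range
        "T2_us": (150, 300),   # dephasing time range
    },
    "crosstalk": {
        "residual_ZZ_kHz": (20, 100),  # between coupled qubit pairs
    },
    "control_errors": {
        "single_qubit_pulse": {"shape": "gaussian", "duration_ns": 25},
        "two_qubit_pulse": {"shape": "sigmoid", "duration_ns": 50},
        "under_rotation": 0.001,  # 0.1% systematic miscalibration
    },
}

def check_ranges(model):
    """Sanity-check that every (lo, hi) range tuple is ordered."""
    return all(lo <= hi
               for section in model.values()
               for v in section.values()
               if isinstance(v, tuple)
               for lo, hi in [v])

print(check_ranges(transmon_noise_model))
```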
By sweeping gate drive frequencies (detuning), the researchers compared their digital twin to the conventional Clifford/Stim model. Where the Stim model predicted a uniform response, the QMC digital twin revealed a spatially patterned syndrome-extraction bias. This structural bias matters to developers because it shows how control-parameter misalignments produce error patterns that simpler models cannot identify.
A Bridge to Fault Tolerance
According to the statement, “These results make hardware-faithful noisy circuit simulation practical at experiment-relevant scales that were previously out of reach.” Because these simulations can be run in bulk, teams can generate realistic syndrome datasets. Such datasets are crucial for training “expressive” decoders, such as neural-network-based ones, which are better able to decipher the intricate error patterns of real-world hardware.
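What a decoder-training dataset looks like can be sketched with a toy stand-in (my illustration, not the team's pipeline): labeled (syndrome, error) pairs for a simple repetition code under independent bit flips. A digital twin would instead supply correlated, hardware-faithful syndromes, which is precisely what makes it valuable here.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_syndrome_dataset(n_samples, p_flip=0.05, n_data=5):
    """Toy dataset: i.i.d. bit-flip errors on an n_data-bit repetition
    code, with syndromes from adjacent parity checks. A stand-in for the
    correlated, hardware-faithful syndromes a digital twin would yield."""
    errors = (rng.random((n_samples, n_data)) < p_flip).astype(np.uint8)
    # Parity check i compares data bits i and i+1.
    syndromes = errors[:, :-1] ^ errors[:, 1:]
    return syndromes, errors

X, y = make_syndrome_dataset(10000)
print("syndrome shape:", X.shape, "error shape:", y.shape)
```

A neural decoder would be trained to map each syndrome row of `X` back to the underlying error row of `y`; the harder and more realistic the noise in the training data, the better the decoder generalizes to real hardware.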
The project’s team represents a “who’s who” of quantum research, including Tong Shen, an expert in quantum many-body simulation; Sebastian Hassinger; Tyler Takeshita; and Daniel Lidar, Viterbi Professor of Engineering at USC and CSO of Quantum Elements.
Looking Ahead
Quantum Elements intends to use this technique as the basis for a platform that emphasizes system co-design, in which error-correction software and hardware layouts are developed together. The digital twin will later be extended with even more complex error models and a direct link between simulated improvements and measurable hardware performance gains.
As the industry advances toward “useful” quantum computing, digital twins that accurately predict hardware behavior could be crucial to accelerating the development of a functional, error-corrected machine.