As billions of dollars pour into the global race for quantum supremacy, a significant hurdle has remained: the sheer physical scale and cost of the hardware required to make these machines reliable. Frank Somhorst, a PhD candidate at the University of Twente (UT), has presented a potential solution that could fundamentally change the architecture of future quantum computers. He developed a method to “clean” photons before they are used in calculations, and his research demonstrates that the number of photons required for a single logical qubit can be reduced by a factor of at least four.
The Challenge of Scalability
Quantum computers promise a paradigm shift in processing capacity, but building them demands a great deal of precision. Light particles called photons are frequently chosen as the building blocks for these systems because of their speed and their remarkable stability when traveling through integrated semiconductor circuits. These properties make photons well suited to scalable systems that can operate in intricate optical circuits and at room temperature.
However, the current state of photonic quantum computing involves a great deal of “waste”: under present paradigms, a single dependable computing component frequently needs hundreds of photons. “You build a state-of-the-art machine, but under the hood you’re throwing away huge amounts of light to correct mistakes,” Somhorst says, capturing how inefficient the current strategy feels. The extensive redundancy and error correction required make scaling up difficult and prohibitively costly.
Tackling “Imperfect” Photons
The root of the problem lies in the quality of the light itself. Reliable quantum computation requires that the photons employed be indistinguishable. Two photons may appear identical at first glance, yet in practice they frequently have subtle differences that “throw a spanner in the works”: one photon may arrive slightly earlier than its counterpart, or carry a tiny frequency deviation. Even such small discrepancies force the system to work hard to repair the resulting errors after the fact.
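To make “indistinguishability” concrete, the sketch below models two photons as Gaussian wavepackets and computes how a timing offset or a frequency detuning degrades their overlap, and with it the Hong–Ou–Mandel interference that photonic circuits rely on. This is a standard textbook model rather than code from the thesis, and all parameter values are invented for illustration.

```python
import numpy as np

def overlap_sq(dt, dw, sigma):
    """|<psi1|psi2>|^2 for two Gaussian wavepackets of temporal width
    sigma, offset by dt in arrival time and dw in angular frequency
    (analytic result for Gaussian envelopes)."""
    return np.exp(-dt**2 / (4 * sigma**2)) * np.exp(-(sigma * dw)**2)

def hom_coincidence(dt, dw, sigma):
    """Coincidence probability at a 50:50 beamsplitter: 0 for perfectly
    indistinguishable photons, 0.5 for fully distinguishable ones."""
    return 0.5 * (1 - overlap_sq(dt, dw, sigma))

sigma = 1.0  # wavepacket duration (arbitrary units)
print(hom_coincidence(0.0, 0.0, sigma))  # 0.0 -> identical photons
print(hom_coincidence(0.5, 0.0, sigma))  # small timing offset
print(hom_coincidence(0.0, 0.5, sigma))  # small frequency detuning
```

Even the modest offsets in the last two lines produce a nonzero coincidence rate, which a computation would register as an error to be corrected.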
Somhorst’s breakthrough rests on a change in approach. Rather than admitting defective photons and relying on significant hardware overhead to correct mistakes afterwards, his method addresses the issue at its root. He designed a special optical circuit that “cleans up” the light before it enters the primary computation step: the circuit combines multiple “imperfect” photons and selects only the best state from them, producing a single high-quality photon.
“Instead of continuously correcting errors after the fact, we first improve the quality of the light itself,” Somhorst explains. A higher-quality input directly reduces the need for downstream error correction.
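The article does not describe the circuit’s internals, but the selection idea can be caricatured with a classical Monte Carlo: generate several photons with random arrival-time jitter and keep the one closest to the ideal. The sketch below is a hypothetical illustration of why selection raises quality, not a model of Somhorst’s actual quantum scheme, and all parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

def mean_overlap(n_candidates, jitter=0.4, sigma=1.0, trials=20_000):
    """Classical caricature of 'combine several imperfect photons, keep
    the best state': each candidate has Gaussian arrival-time jitter;
    we keep the one closest to t = 0 and report its average
    overlap-squared with an ideal photon."""
    dt = rng.normal(0.0, jitter, size=(trials, n_candidates))
    best = np.min(np.abs(dt), axis=1)  # pick the least-jittered photon
    return np.mean(np.exp(-best**2 / (4 * sigma**2)))

for n in (1, 2, 4):
    print(f"{n} candidate photon(s): mean overlap^2 = {mean_overlap(n):.4f}")
```

Even in this toy model, the quality of the surviving photon climbs steadily with the number of candidates it is distilled from.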
Radical Efficiency Gains
The ramifications for hardware design are significant. Although Somhorst’s method consumes several imperfect photons to create each “clean” one, the system’s overall efficiency improves so much that the total photon count drops dramatically. According to a conservative estimate based on his models, the number of photons required per logical qubit could be reduced by a factor of four.
According to Somhorst, this factor of four is probably a lower bound. “We deliberately took a conservative approach,” he says, suggesting that the efficiency gains may be even greater once these techniques are implemented on a broader scale. Lower photon requirements would mean simpler quantum computers, with less hardware and fewer error-correction procedures, easing the transition from experimental prototypes to scalable, realistic systems.
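As a back-of-the-envelope check on that claim, the numbers below show how spending a few raw photons per “clean” photon can still cut the total budget. The baseline overhead and distillation cost here are hypothetical placeholders; the article only states the relative factor of at least four.

```python
# Back-of-the-envelope resource accounting (all numbers hypothetical;
# the article only states the relative factor of at least 4).
raw_per_logical_qubit = 400    # assumed baseline: photons burned on redundancy
raw_per_clean_photon = 3       # assumed cost of distilling one clean photon
clean_per_logical_qubit = 30   # assumed need once inputs are high quality

total_with_cleaning = raw_per_clean_photon * clean_per_logical_qubit
print(f"baseline:      {raw_per_logical_qubit} raw photons / logical qubit")
print(f"with cleaning: {total_with_cleaning} raw photons / logical qubit")
print(f"reduction:     {raw_per_logical_qubit / total_with_cleaning:.1f}x")
```

The point of the arithmetic is that cleaning pays for itself: each clean photon costs several raw ones, yet the collapse in error-correction overhead dominates the total.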
From Twente to NASA
Over the course of his doctoral work, the research progressed from theoretical investigation to actual hardware. Somhorst collaborated with the Twente-based company QuiX Quantum to test his “photon filtering” technique on an integrated photonic processor. Following the tests’ success, the University of Twente has submitted a patent application for the technique.
The study has also attracted attention from abroad. Somhorst began working with NASA, a partnership that included a presentation of his findings at the NASA Ames Research Center. He called it “surreal” to watch a theoretical notion from Twente make its way through NASA’s corridors and to realize his work “could have a real impact.”
Recognition and Future Outlook
The work’s value has been acknowledged academically as well: the journal Physical Review Applied highlighted Somhorst’s research as an “Editor’s Suggestion.”
Somhorst will defend his PhD thesis, “Perfect Photons with Imperfect Photonics for Quantum Information Processing,” in 2026. He conducted his research at the University of Twente’s Faculty of Applied Sciences and MESA+ Institute, specifically in the Adaptive Quantum Optics group.
Smarter handling of photon quality could prove crucial in making photonic quantum computers more compact, more affordable, and ultimately part of our technological future.