Quantum computing today is defined by the Noisy Intermediate-Scale Quantum (NISQ) era, a term coined in 2018 by physicist John Preskill to characterise the quantum processors of the time, processors that are, in fact, still in use today. Although these devices can perform quantum operations, their capabilities are limited by considerable noise and error rates.
What Is the NISQ Era?
NISQ devices usually feature tens to a few hundred qubits, although some newer systems exceed 1,000. Atom Computing’s 1,180-qubit quantum processor, for example, crossed the 1,000-qubit milestone in October 2023, and IBM’s Condor also has more than 1,000 qubits, but as of 2024 sub-1,000-qubit processors remain the standard.
Characteristics
Among the key characteristics of NISQ systems are:
- Limited Coherence: Qubits don’t stay in their quantum states for very long.
- Noisy Operations: Quantum gates and measurements are error-prone because of hardware imperfections and ambient noise. These computers are sensitive to their surroundings and vulnerable to quantum decoherence.
- Lack of Error Correction: NISQ devices lack the resources needed for comprehensive quantum error correction, so they cannot continually detect and repair faults during circuit execution.
- Hybrid Algorithms: To work around these limits, NISQ algorithms frequently take a hybrid approach, offloading part of the computation to classical computers to compensate for the quantum device’s limitations.
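The hybrid quantum-classical loop mentioned above can be sketched in a few lines. This is a toy stand-in, not any vendor's API: the "quantum device" is simulated classically by sampling a single qubit whose measurement statistics follow cos²(θ/2), and the shot count and learning rate are illustrative choices.

```python
import math, random

random.seed(0)

def measure_expectation(theta, shots=2000):
    # Stand-in "quantum device": one qubit rotated by theta, measured in
    # the Z basis. P(outcome +1) = cos^2(theta / 2), so the shot-averaged
    # result is a noisy estimate of <Z> = cos(theta).
    p_plus = math.cos(theta / 2) ** 2
    total = sum(1 if random.random() < p_plus else -1 for _ in range(shots))
    return total / shots

def hybrid_minimise(steps=60, lr=0.4):
    # Classical outer loop: gradient descent on <Z> using the
    # parameter-shift rule, with each gradient term estimated on the
    # (noisy) simulated device.
    theta = 0.3
    for _ in range(steps):
        grad = (measure_expectation(theta + math.pi / 2)
                - measure_expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta

theta = hybrid_minimise()
# <Z> = cos(theta) is minimised at theta = pi, which the loop converges
# toward despite the shot noise in every estimate.
```

The structure, not the physics, is the point: the quantum processor only evaluates expectation values, while all the optimisation logic runs on a classical machine.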
Current Status and Difficulties
Commercially accessible quantum computers still have high error rates, even though quantum computing has progressed beyond strictly lab-based activity. This inherent fallibility has led some experts to predict a ‘quantum winter’ for the industry, while others think the technical difficulties will keep the sector limited for decades. Despite these advances, NISQ machines are often no more effective than classical computers at solving general problems.
The following are some significant NISQ technology limits and disadvantages:
- Error Accumulation: The depth and complexity of quantum circuits that may be successfully implemented are limited by the rapid accumulation of errors.
- Limited Algorithmic Applications: NISQ devices are unable to supply fully error-corrected qubits, which are necessary for many proposed quantum algorithms to function at a practical scale.
- Scalability Issues: It is still very difficult to increase the number of qubits while preserving or enhancing their quality.
- High Cost and Complexity: NISQ device construction and maintenance are costly and need specialist infrastructure, such as cryogenic systems.
- Uncertain Path to Quantum Advantage: It is currently unclear whether NISQ computers can deliver a definite, useful quantum advantage over the best classical algorithms for real-world applications. Demonstrations of quantum supremacy, such as Google’s 53-qubit Sycamore processor in 2019, have generally focused on problems designed to be hard for classical computers but lacking immediate practical applications.
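The error-accumulation point above can be made concrete with a back-of-envelope model. Assuming (idealised) independent, uncorrelated gate errors, overall circuit fidelity decays as the product of individual gate fidelities; the 99.5% figure below is a representative number for good NISQ two-qubit gates, not a value from the article.

```python
def circuit_fidelity(gate_fidelity: float, n_gates: int) -> float:
    # Idealised model: total circuit fidelity is the product of
    # independent per-gate fidelities.
    return gate_fidelity ** n_gates

# With a 99.5%-fidelity gate, typical of good NISQ hardware:
shallow = circuit_fidelity(0.995, 100)    # ~0.61: shallow circuits survive
deep = circuit_fidelity(0.995, 1000)      # ~0.007: deep circuits are noise
```

This is why circuit depth, not just qubit count, bounds what NISQ devices can run: a thousand-gate circuit produces a correct output well under 1% of the time in this model.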
Developments
Significant progress is being made in spite of the obstacles. Research continues on enhancing qubit performance, lowering noise, and developing better error-correction techniques. Google has shown that quantum error correction (QEC) is not only theoretically sound but also practical. In April 2024, Microsoft researchers reported a significant decrease in error rates using just four logical qubits, suggesting that large-scale quantum computing may arrive sooner than previously believed.
The development of dynamic compilation strategies that make quantum systems easier to use, as well as innovations in supporting systems like cryogenics, optics, and control and readout, are what propel advancements, according to Chris Coleman, a Condensed Matter Physicist and consultant to The Quantum Insider.
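The principle behind these QEC results can be illustrated with the simplest code of all, a three-qubit bit-flip repetition code with majority-vote decoding. This Monte Carlo sketch is illustrative only (the actual experiments use far more sophisticated codes, such as surface codes), and the 5% error rate is an arbitrary example value.

```python
import random

random.seed(1)

def logical_error_rate(p, trials=100_000):
    # Three-qubit bit-flip repetition code: each copy of the bit flips
    # independently with probability p; the majority vote over the three
    # copies fails only when two or more copies flip.
    fails = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        fails += flips >= 2
    return fails / trials

# Analytically, failure probability = 3*p^2*(1-p) + p^3 = 0.00725 for
# p = 0.05, i.e. well below the 5% raw error rate of a single copy.
rate = logical_error_rate(0.05)
```

The key property, shared by real QEC codes, is that when physical errors are rare enough, encoding suppresses the logical error rate below the physical one, and larger codes suppress it further.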
Applications
NISQ devices are enabling useful research and the exploration of potential applications in a number of fields:
- Quantum Chemistry and Materials Science: Simulating chemical processes and molecular structures, which might improve catalysis and drug development. For instance, Quandela pushes the limits of NISQ-era quantum computing by using a photonic method with the goal of reducing noise and scaling quantum systems.
- Variational Algorithms: Algorithms such as the Quantum Approximate Optimisation Algorithm (QAOA) and the Variational Quantum Eigensolver (VQE), which frequently employ hybrid quantum-classical techniques, are designed specifically for NISQ devices to extract practical results despite noise.
- Optimization Problems: Tackling tasks in supply chain management, logistics, and finance.
- Quantum Machine Learning: Investigating quantum-enhanced methods for processing large datasets and improving predictive analytics.
- Quantum Simulation: Simulating quantum systems for fundamental science.
- Cryptography: Although NISQ devices are not powerful enough to break existing public-key encryption, they are used to investigate post-quantum cryptography and quantum key distribution for secure communication.
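To make the VQE idea from the list above concrete, here is a minimal sketch for a deliberately tiny problem: one qubit with the ansatz |ψ(θ)⟩ = Ry(θ)|0⟩ and the hypothetical Hamiltonian H = Z + 0.5·X, for which the energy has the closed form E(θ) = cos θ + 0.5·sin θ. The grid-scan "optimiser" is a stand-in; real VQE runs use optimisers such as COBYLA or SPSA against a quantum backend.

```python
import math

def energy(theta):
    # For |psi(theta)> = Ry(theta)|0>:  <Z> = cos(theta), <X> = sin(theta),
    # so the expectation of H = Z + 0.5*X is:
    return math.cos(theta) + 0.5 * math.sin(theta)

# "Classical optimiser": a coarse grid scan over theta in [0, 2*pi).
best = min(energy(t / 1000) for t in range(6284))

# Exact ground energy of H is -sqrt(1 + 0.25), the lowest eigenvalue.
exact = -math.sqrt(1 + 0.5 ** 2)
```

The variational principle guarantees E(θ) ≥ exact for every θ, so minimising over the ansatz parameters approaches the ground energy from above; on NISQ hardware, each `energy` evaluation would be a noisy shot-averaged measurement rather than an exact formula.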
Many quantum systems are now accessible via cloud platforms, which promotes basic research and supports early adopters in their search for near-term uses.
Path to Fault-Tolerant Quantum Computing
The NISQ era is widely seen as a bridge between today’s noisy systems and the fault-tolerant quantum computers of the future. The ultimate objective is fully error-corrected quantum computers that can tackle far larger and more difficult problems. This transition will require:
- Improved Coherence and Quality of Qubits: Achieving much longer coherence times and much lower gate error rates to produce more stable qubits.
- Enhanced Quantum Error Correction: Developing effective and scalable QEC codes. Fault-tolerant quantum computers are expected to need many physical qubits, potentially hundreds to thousands, to encode each logical qubit.
- Increased Qubit Count: Achieving far larger quantities of qubits than the tens to hundreds seen in NISQ devices.
- Novel Qubit Technologies: Investigating methods like topological qubits, which are intended to be more error-resistant by nature and are used in Microsoft’s Majorana 1 device.
The NISQ era’s lifespan is still uncertain, but analysts expect it to last several more years as researchers work toward fault-tolerant systems. Early fault-tolerant machines may demonstrate scientific quantum advantage in the coming years, with estimates for fully fault-tolerant quantum computing ranging from the late 2020s to the 2030s or later.
In summary, although NISQ computing is a complex field with difficult problems still to solve, it is also a rapidly developing stage propelled by a committed community of academics and industry experts working together to overcome these obstacles. The advances made now lay the foundation for the revolutionary potential of quantum technology.