Technology · 2026-03-04 · 3 min read

Quantum Computers Cannot Fix Themselves - But a New Technique Brings That Closer

A streamlined approach to quantum process tomography cuts the measurements needed to characterize what a quantum device is actually doing, without sacrificing accuracy.

There is a gap between what a quantum computer is supposed to do and what it actually does. Bridging that gap - measuring it, understanding it, correcting for it - is one of the central engineering challenges of the field. And right now, the tools for doing so are expensive in the one currency quantum systems can least afford to spend: measurements.

Every time you measure a quantum system, you disturb it. That is not a technical limitation waiting to be overcome with better equipment. It is a fundamental feature of quantum mechanics. Which means that characterizing what a quantum device is actually doing - a process called quantum process tomography, or QPT - requires a careful choreography of how and when you look.

The Verification Problem in Quantum Hardware

Quantum computers work by applying sequences of operations called gates to quantum states. In theory, those gates perform precise mathematical transformations. In practice, they deviate from ideal behavior because of device imperfections, thermal noise, electromagnetic interference, and a host of other environmental factors.
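The gap between an ideal gate and its noisy implementation can be made concrete with a toy example. The sketch below applies an ideal X (NOT) gate and a slightly over-rotated version of it to a qubit; the over-rotation angle `epsilon` is an illustrative assumption, not a measured device error, and the example is not drawn from the paper.

```python
import numpy as np

# Ideal X (NOT) gate.
X_ideal = np.array([[0, 1], [1, 0]], dtype=complex)

def rx(theta):
    """Rotation about the X axis by angle theta."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]],
                    dtype=complex)

# A "noisy" X gate: an over-rotation by a small angle epsilon.
# (Up to a global phase, the ideal X is rx(pi).)
epsilon = 0.05                      # assumed over-rotation, in radians
X_noisy = rx(np.pi + epsilon)

state = np.array([1, 0], dtype=complex)   # qubit prepared in |0>
out_ideal = X_ideal @ state
out_noisy = X_noisy @ state

# Overlap fidelity between the two output states: |<ideal|noisy>|^2.
fidelity = abs(np.vdot(out_ideal, out_noisy)) ** 2
print(round(fidelity, 6))
```

A single gate with a 0.05-radian over-rotation still produces an output over 99.9% faithful to the ideal, which is why such errors only become visible once many gates are chained together.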

The deviation matters. Small errors in individual gates compound rapidly when gates are chained together in a computation. A quantum circuit that should produce a specific output may instead produce noise - and without a way to characterize the errors, you cannot correct them systematically.
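How fast small errors compound can be seen with a simple independent-error model: if each gate retains fidelity (1 - eps), a circuit of m gates retains roughly (1 - eps)^m. The per-gate error rate below is an assumed round number for illustration, not a figure from the study.

```python
# Assumed per-gate error rate of 0.1% under an independent-error model.
eps = 0.001

for m in (10, 100, 1000):
    # Approximate circuit fidelity after m gates.
    print(m, round((1 - eps) ** m, 3))
```

Even at a 0.1% per-gate error rate, a thousand-gate circuit keeps only about 37% fidelity, which is the compounding the article describes.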

Quantum process tomography addresses this by reconstructing a mathematical description of what each gate is actually doing. But standard approaches require a number of measurements that scales exponentially with the number of qubits involved. For small systems this is manageable; for the larger systems where quantum computing's potential advantages actually materialize, it quickly becomes impractical.
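The exponential scaling is easy to quantify. An n-qubit process acts on a Hilbert space of dimension d = 2^n, and full standard QPT must pin down a d² × d² process matrix, so the parameter count (and with it the number of measurement settings) grows like 16^n. The numbers below are this generic counting argument, not the paper's own measurement budgets.

```python
def qpt_parameters(n_qubits: int) -> int:
    """Entries of the d^2 x d^2 process matrix for an n-qubit channel."""
    d = 2 ** n_qubits          # Hilbert-space dimension
    return d ** 4              # grows as 16^n

for n in (1, 2, 4, 8):
    print(n, qpt_parameters(n))
```

One qubit needs 16 parameters; eight qubits already need over four billion, which is why reducing measurement overhead is the central concern of methods like CQPT.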

A Smarter Framework: Return to Input

Researchers from Tohoku University, NAIST, and the University of Information Technology have developed a method they call compilation-based quantum process tomography, or CQPT. The core idea is a "return-to-input" model: rather than probing a quantum process with arbitrary input states and measuring arbitrary outputs, the framework designs measurement sequences that are optimized to extract the maximum information with the minimum number of steps.

The team developed two complementary versions of CQPT, based on two different mathematical representations of quantum processes - Kraus operators and Choi matrices. Each handles a different class of quantum operations, making the framework broadly applicable rather than narrowly specialized.
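The two representations the team builds on are standard objects in quantum information. As an illustration (using the textbook amplitude-damping channel, not necessarily an example from the paper), the sketch below converts a Kraus-operator description into the equivalent Choi matrix and checks a basic consistency property.

```python
import numpy as np

# Amplitude-damping channel with assumed decay probability gamma:
# a standard textbook channel, chosen purely for illustration.
gamma = 0.1
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)
kraus = [K0, K1]

def choi_from_kraus(kraus_ops):
    """Choi matrix C = sum_k vec(K_k) vec(K_k)^dagger (column-stacking vec)."""
    d = kraus_ops[0].shape[0]
    C = np.zeros((d * d, d * d), dtype=complex)
    for K in kraus_ops:
        v = K.reshape(-1, 1, order="F")   # column-stacking vectorization
        C += v @ v.conj().T
    return C

C = choi_from_kraus(kraus)
# For a trace-preserving channel the Choi matrix has trace d (= 2 here).
print(round(C.trace().real, 6))
```

Kraus operators describe a channel as a sum of concrete operations, while the Choi matrix packs the same channel into a single positive-semidefinite matrix; having a version of CQPT for each representation is what makes the framework broadly applicable.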

Dr. Ho, one of the study's authors, framed the stakes plainly: "Efficient and scalable methods for characterizing quantum processes are important for the future of quantum computing." The paper, published in Advanced Quantum Technologies on February 26, 2026, demonstrates the framework's theoretical properties and validates it through simulations.

What It Can and Cannot Do Yet

The simulations show that CQPT achieves accurate process reconstruction while requiring fewer measurements than conventional approaches. That is a meaningful advance on paper. But the study stops short of experimental implementation on real quantum hardware.

That gap matters. Simulations assume idealized conditions that real devices do not enjoy. Noise, decoherence, and gate crosstalk in actual quantum processors can interact with measurement protocols in ways that simulations do not capture. Whether CQPT performs as well on hardware as it does in theory is the next question the researchers say they plan to address.

Scalability to larger qubit counts also remains to be demonstrated. The framework's design philosophy is aimed at scalability, but scalability claims in quantum computing tend to deserve scrutiny until they survive contact with real hardware at meaningful scales.

Still, the theoretical contribution is solid. Reducing the measurement overhead for process characterization is a genuine step toward the kind of reliable, verifiable quantum systems that the field needs before it can deliver on its broader promises. The more efficiently engineers can understand what their hardware is doing, the more precisely they can fix it.

Source: Tohoku University, NAIST, and the University of Information Technology. Study published in Advanced Quantum Technologies, February 26, 2026. Media contact: Tohoku University Public Relations, public_relations@grp.tohoku.ac.jp.