Compare Papers

Paper 1

Non-Exponential Behaviour in Logical Randomized Benchmarking

Athena Ceasura, Pavithran Iyer, Joel J. Wallman, Hakop Pashayan

Year
2022
Journal
arXiv preprint
arXiv
2212.05488

We construct a gate- and time-independent noise model under which the output of a logical randomized benchmarking protocol oscillates rather than decaying exponentially. To illustrate the idea, we first construct an example in standard randomized benchmarking where we assume the existence of "hidden" qubits, which permit a choice of representation of the Clifford group containing multiplicities. With each gate application, we use the multiplicities to update a hidden memory of the gate history, circumventing theorems that guarantee exponential decay of the output. In our focal setting of logical randomized benchmarking, we show that the machinery associated with implementing quantum error correction can facilitate non-exponential decay: there, the role of the hidden qubits is played by the syndrome qubits used in error correction, which are strongly coupled to the logical qubits via a decoder.
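A toy numerical illustration of the contrast the abstract describes (this is not the paper's construction; the functional forms, amplitudes, and angle below are illustrative assumptions): standard randomized benchmarking theorems predict an exponential decay A·p^m + B in sequence length m, whereas a per-gate phase accumulating coherently in a hidden memory can instead produce an oscillating average output.

```python
import numpy as np

def exponential_rb(m, A=0.5, B=0.5, p=0.98):
    # Standard RB prediction: survival probability decays
    # exponentially (and monotonically) in sequence length m.
    return A * p**m + B

def oscillating_rb(m, A=0.5, B=0.5, theta=0.3):
    # Hidden-memory caricature: each gate adds a fixed phase theta to a
    # hidden register, so the averaged output oscillates with m rather
    # than decaying exponentially. Parameters are illustrative only.
    return A * np.cos(theta * m) + B

lengths = np.arange(0, 50)
exp_curve = exponential_rb(lengths)
osc_curve = oscillating_rb(lengths)

# The exponential curve is strictly decreasing; the hidden-memory
# curve rises again, so no single exponential can fit it.
assert np.all(np.diff(exp_curve) < 0)
assert np.any(np.diff(osc_curve) > 0)
```

The point of the sketch is only that an oscillating signal is qualitatively incompatible with the exponential fitting model that standard RB analysis assumes.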


Paper 2

Decoder Switching: Breaking the Speed-Accuracy Tradeoff in Real-Time Quantum Error Correction

Riki Toshio, Kaito Kishi, Jun Fujisaki, Hirotaka Oshima, Shintaro Sato, Keisuke Fujii

Year
2025
Journal
arXiv preprint
arXiv
2510.25222

The realization of fault-tolerant quantum computers hinges on the construction of high-speed, high-accuracy, real-time decoding systems. The persistent challenge lies in the fundamental trade-off between speed and accuracy: efforts to improve a decoder's accuracy often lead to unacceptable increases in decoding time and hardware complexity, while attempts to accelerate decoding result in significant degradation of the logical error rate. To overcome this challenge, we propose a novel framework, decoder switching, which balances these competing demands by combining a faster, soft-output decoder ("weak decoder") with a slower, high-accuracy decoder ("strong decoder"). In ordinary rounds, the weak decoder processes error syndromes and simultaneously evaluates its reliability via soft information. Only when encountering a decoding window with low reliability do we switch to the strong decoder to achieve more accurate decoding. Numerical simulations suggest that this framework can achieve accuracy comparable to, or even surpassing, that of the strong decoder, while maintaining an average decoding time on par with the weak decoder. We also develop an online decoding scheme tailored to our framework, named double window decoding, and elucidate the criteria for preventing an exponential slowdown of quantum computation. These findings break the long-standing speed-accuracy trade-off, paving the way for scalable real-time decoding devices.
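The switching logic the abstract describes can be sketched as follows. Everything here is a hypothetical stand-in, not the paper's implementation: the decoder bodies are placeholders, and the reliability formula and threshold are invented for illustration; the only faithful part is the control flow of escalating low-reliability windows to the strong decoder.

```python
def weak_decode(syndrome):
    # Stand-in for a fast soft-output decoder: returns a tentative
    # correction plus a reliability score in [0, 1] (higher = more
    # trustworthy). Both computations are placeholders.
    correction = list(syndrome)
    reliability = 1.0 - 0.1 * sum(syndrome)
    return correction, reliability

def strong_decode(syndrome):
    # Stand-in for a slow, high-accuracy decoder, invoked only for
    # the rare windows the weak decoder flags as unreliable.
    return list(syndrome)

def decode_with_switching(syndrome, threshold=0.7):
    # Decoder switching: take the fast path when the soft information
    # says the weak decoder's answer is reliable; otherwise escalate.
    correction, reliability = weak_decode(syndrome)
    if reliability < threshold:
        return strong_decode(syndrome), "strong"
    return correction, "weak"

_, path_easy = decode_with_switching([0, 0, 1])     # reliability 0.9
_, path_hard = decode_with_switching([1, 1, 1, 1])  # reliability 0.6
```

Because most decoding windows are easy, the average latency stays close to the weak decoder's, while accuracy on hard windows is set by the strong decoder; the choice of threshold trades these off.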
