Compare Papers
Paper 1
Logical Error Rate Scaling of the Toric Code
Fern H. E. Watson, Sean D. Barrett
- Year: 2013
- Journal: arXiv preprint
- DOI: 10.48550/arXiv.1312.5213
- arXiv: 1312.5213
To date, a great deal of attention has focused on characterizing the performance of quantum error correcting codes via their thresholds, the maximum correctable physical error rate for a given noise model and decoding strategy. Practical quantum computers will necessarily operate below these thresholds, so other performance indicators become important. In this work we consider the scaling of the logical error rate of the toric code and demonstrate how, in turn, this may be used to calculate a key performance indicator. We use a perfect matching decoding algorithm to determine the scaling of the logical error rate and identify two distinct operating regimes. The first regime admits a universal scaling analysis due to a mapping to a statistical physics model. The second regime characterizes the behavior in the limit of small physical error rate and can be understood by counting the error configurations that lead to failure of the decoder. We present a conjecture for the ranges of validity of these two regimes and use them to quantify the overhead: the total number of physical qubits required to perform error correction.
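The low-error-rate regime the abstract describes suggests a simple way to estimate such an overhead. The sketch below is illustrative only: the scaling ansatz P_L ≈ A (p/p_th)^⌈d/2⌉ (motivated by the fact that a minimum-weight matching decoder on a distance-d code can first fail on errors of weight ⌈d/2⌉), the prefactor, and the threshold value are assumptions for this example, not figures taken from the paper.

```python
import math

def required_toric_code_size(p, p_target, p_th=0.10, prefactor=1.0, d_max=201):
    """Smallest odd distance d with P_L <= p_target, under the illustrative
    low-p ansatz P_L ~ prefactor * (p / p_th)**ceil(d / 2).

    The ansatz, prefactor, and threshold value are assumptions for this
    sketch, not results quoted from the paper.
    """
    if p >= p_th:
        raise ValueError("physical error rate must be below threshold")
    for d in range(3, d_max + 1, 2):
        p_logical = prefactor * (p / p_th) ** math.ceil(d / 2)
        if p_logical <= p_target:
            # A distance-d toric code uses 2 * d**2 physical data qubits.
            return d, 2 * d ** 2
    raise ValueError("no distance up to d_max suffices")

# Example: physical error rate 1e-3, target logical error rate 1e-12.
d, n_qubits = required_toric_code_size(p=1e-3, p_target=1e-12)
print(f"distance {d}, {n_qubits} physical qubits")
```

With these placeholder numbers the example returns distance 11 and 242 physical qubits; the point is the shape of the calculation, not the specific values.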
Paper 2
Decoder Switching: Breaking the Speed-Accuracy Tradeoff in Real-Time Quantum Error Correction
Riki Toshio, Kaito Kishi, Jun Fujisaki, Hirotaka Oshima, Shintaro Sato, Keisuke Fujii
- Year: 2025
- Journal: arXiv preprint
- DOI: 10.48550/arXiv.2510.25222
- arXiv: 2510.25222
The realization of fault-tolerant quantum computers hinges on the construction of high-speed, high-accuracy, real-time decoding systems. The persistent challenge lies in the fundamental trade-off between speed and accuracy: efforts to improve a decoder's accuracy often lead to unacceptable increases in decoding time and hardware complexity, while attempts to accelerate decoding cause a significant degradation in the logical error rate. To overcome this challenge, we propose a novel framework, decoder switching, which balances these competing demands by combining a faster, soft-output decoder (the "weak decoder") with a slower, high-accuracy decoder (the "strong decoder"). In ordinary rounds, the weak decoder processes error syndromes and simultaneously evaluates its own reliability via soft information. Only when it encounters a decoding window with low reliability does the system switch to the strong decoder to achieve more accurate decoding. Numerical simulations suggest that this framework can achieve accuracy comparable to, or even surpassing, that of the strong decoder, while maintaining an average decoding time on par with the weak decoder. We also develop an online decoding scheme tailored to our framework, named double window decoding, and elucidate the criteria for preventing an exponential slowdown of quantum computation. These findings break the long-standing speed-accuracy trade-off, paving the way for scalable real-time decoding devices.
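The control flow the abstract describes is straightforward to sketch. The snippet below is an illustrative outline only: the decoder functions, the soft-output confidence score, and the threshold tau are hypothetical placeholders, not the paper's API or its reliability criterion.

```python
from dataclasses import dataclass

@dataclass
class DecodeResult:
    correction: list      # proposed correction for this decoding window
    confidence: float     # soft output: higher means more reliable

def weak_decode(syndrome) -> DecodeResult:
    """Placeholder for a fast decoder that also emits soft information."""
    ...

def strong_decode(syndrome) -> list:
    """Placeholder for a slow, high-accuracy decoder."""
    ...

def decode_window(syndrome, tau: float = 0.99) -> list:
    """Decoder switching for one window: trust the weak decoder unless
    its reported confidence falls below the threshold tau, in which case
    re-decode the same window with the strong decoder."""
    result = weak_decode(syndrome)
    if result.confidence >= tau:
        return result.correction      # common case: fast path
    return strong_decode(syndrome)    # rare case: accurate path
```

Because the switch is triggered only on low-confidence windows, the average latency stays near the weak decoder's while the rare hard cases receive the strong decoder's accuracy, which is the trade-off-breaking behavior the abstract reports.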