Compare Papers
Paper 1
Artificial Intelligence for Quantum Error Correction: A Comprehensive Review
Zihao Wang, Hao Tang
- Year
- 2024
- Journal
- arXiv preprint
- DOI
- arXiv:2412.20380
- arXiv
- 2412.20380
Quantum Error Correction (QEC) is the process of detecting and correcting errors in quantum systems, which are prone to decoherence and quantum noise. Because QEC is crucial for developing stable and highly accurate quantum computing systems, considerable research effort has gone into finding the best QEC strategy. Recently, Google's breakthrough demonstrated great potential for improving the accuracy of existing error correction methods. This survey provides a comprehensive review of advancements in the use of artificial intelligence (AI) tools to enhance QEC schemes for existing Noisy Intermediate-Scale Quantum (NISQ) systems. Specifically, we focus on machine learning (ML) strategies, spanning unsupervised, supervised, semi-supervised, and reinforcement learning methods. The evidence shows that these methods have recently achieved superior efficiency and accuracy in the QEC pipeline compared with conventional approaches. Our review covers more than 150 relevant studies, offering a comprehensive overview of progress and perspectives in this field. We organize the reviewed literature by the AI strategies employed and the improvements in error correction performance. We also discuss challenges ahead, such as data sparsity caused by limited quantum error datasets and scalability issues as the number of quantum bits (qubits) in quantum systems keeps increasing rapidly. We conclude the paper with a summary of existing work and future research directions aimed at deeper integration of AI techniques into QEC strategies.
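As a toy illustration of the supervised-learning setting this survey reviews, the sketch below "trains" a lookup-table decoder for the 3-qubit bit-flip repetition code from sampled (error, syndrome) pairs. All names, the noise model, and the parameters are our own illustrative choices, not taken from the survey; real ML decoders replace the lookup table with neural networks over codes whose syndrome space is far too large to tabulate.

```python
import random
from collections import Counter, defaultdict

# Toy supervised decoder for the 3-qubit bit-flip repetition code
# (parity checks on qubit pairs (1,2) and (2,3)).

def syndrome(error):
    """Measure the two parity checks on a 3-bit X-error pattern."""
    return (error[0] ^ error[1], error[1] ^ error[2])

def sample_error(p, rng):
    """Independent bit-flip on each qubit with probability p."""
    return tuple(int(rng.random() < p) for _ in range(3))

def train_lookup_decoder(p=0.05, shots=20000, seed=0):
    """'Supervised training': label each observed syndrome with the
    error pattern that most frequently produced it in the samples."""
    rng = random.Random(seed)
    counts = defaultdict(Counter)
    for _ in range(shots):
        e = sample_error(p, rng)
        counts[syndrome(e)][e] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

decoder = train_lookup_decoder()
```

For physical error rates below 50%, the learned table coincides with minimum-weight decoding; the appeal of ML approaches is that they can approximate this mapping for codes where exhaustive tabulation is impossible.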
Paper 2
Decoder Switching: Breaking the Speed-Accuracy Tradeoff in Real-Time Quantum Error Correction
Riki Toshio, Kaito Kishi, Jun Fujisaki, Hirotaka Oshima, Shintaro Sato, Keisuke Fujii
- Year
- 2025
- Journal
- arXiv preprint
- DOI
- arXiv:2510.25222
- arXiv
- 2510.25222
The realization of fault-tolerant quantum computers hinges on the construction of high-speed, high-accuracy, real-time decoding systems. The persistent challenge lies in the fundamental trade-off between speed and accuracy: efforts to improve the decoder's accuracy often lead to unacceptable increases in decoding time and hardware complexity, while attempts to accelerate decoding result in a significant degradation of the logical error rate. To overcome this challenge, we propose a novel framework, decoder switching, which balances these competing demands by combining a faster, soft-output decoder ("weak decoder") with a slower, high-accuracy decoder ("strong decoder"). In typical rounds, the weak decoder processes error syndromes and simultaneously evaluates its reliability via soft information. Only when encountering a decoding window with low reliability do we switch to the strong decoder to achieve more accurate decoding. Numerical simulations suggest that this framework can achieve accuracy comparable to, or even surpassing, that of the strong decoder, while maintaining an average decoding time on par with the weak decoder. We also develop an online decoding scheme tailored to our framework, named double window decoding, and elucidate the criteria for preventing an exponential slowdown of quantum computation. These findings break the long-standing speed-accuracy trade-off, paving the way for scalable real-time decoding devices.
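The control flow the abstract describes — run the weak decoder every round, and escalate to the strong decoder only on low-reliability windows — can be sketched as below. The decoders, latencies, confidence model, and threshold are all invented placeholders for illustration, not the paper's implementation or soft-output metric.

```python
import random

# Toy sketch of decoder switching. All numbers are invented:
# WEAK_TIME/STRONG_TIME are relative latencies, THRESHOLD is the
# reliability cutoff for escalating to the strong decoder.
WEAK_TIME, STRONG_TIME = 1.0, 50.0
THRESHOLD = 0.2

def weak_decoder(rng):
    """Fast decoder returning (decoded correctly?, soft confidence).
    In this toy model, low-confidence windows are exactly the ones
    the weak decoder tends to get wrong."""
    confidence = rng.random()
    correct = confidence >= THRESHOLD or rng.random() < 0.5
    return correct, confidence

def strong_decoder(rng):
    """Slow, high-accuracy decoder (placeholder 99.9% success)."""
    return rng.random() < 0.999

def decode_round(rng):
    """One decoding window: try the weak decoder, switch if unreliable."""
    correct, conf = weak_decoder(rng)
    if conf < THRESHOLD:
        return strong_decoder(rng), WEAK_TIME + STRONG_TIME
    return correct, WEAK_TIME

rng = random.Random(1)
results = [decode_round(rng) for _ in range(10000)]
accuracy = sum(ok for ok, _ in results) / len(results)
avg_time = sum(t for _, t in results) / len(results)
```

In this toy model only about 20% of rounds pay the strong decoder's latency, so the average decoding time stays close to the weak decoder's while the overall accuracy tracks the strong decoder's — the qualitative behavior the paper's numerical simulations report.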