Compare Papers
Paper 1
Fair Decoder Baselines and Rigorous Finite-Size Scaling for Bivariate Bicycle Codes on the Quantum Erasure Channel
Tushar Pandey
- Year: 2026
- Journal: arXiv preprint
- DOI: arXiv:2603.19062
- arXiv: 2603.19062
Fair threshold estimation for bivariate bicycle (BB) codes on the quantum erasure channel runs into two recurring problems: decoder-baseline unfairness and the conflation of finite-size pseudo-thresholds with true asymptotic thresholds. We run both uninformed and \emph{erasure-aware} minimum-weight perfect matching (MWPM) surface code baselines alongside BP-OSD decoding of BB codes. With standard depolarizing-weight MWPM and no erasure information, performance matches random guessing on the erasure channel in our tested regime -- so prior work that compares against this baseline is really comparing decoders, not codes. Using 200{,}000 shots per point and bootstrap confidence intervals, we sweep five BB code sizes from $N=144$ to $N=1296$. Pseudo-thresholds (WER = 0.10) run from $p^* = 0.370$ to $0.471$; finite-size scaling (FSS) gives an asymptotic threshold $p^*_\infty \approx 0.488$, within 2.4\% of the zero-rate limit and without maximum-likelihood decoding. On the fair baseline, BB at $N=1296$ has a modest edge in threshold over the surface code at twice the qubit count, and a 12$\times$ lower normalized overhead -- the latter is where the practical advantage sits. All runs are reproducible from recorded seeds and package versions.
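The estimation pipeline in the abstract (bootstrap confidence intervals on WER, a pseudo-threshold read off at WER = 0.10, and a finite-size extrapolation over code sizes) can be sketched as below. This is an illustrative sketch, not the paper's implementation: the function names are invented, and the $p^*(N) = p^*_\infty - c/\sqrt{N}$ ansatz is a guessed scaling form, since the paper's actual FSS ansatz is not given here.

```python
import numpy as np

def bootstrap_wer_ci(failures, shots, n_boot=1000, alpha=0.05, rng=None):
    """Bootstrap a confidence interval for the word error rate (WER)
    from a count of decoding failures out of `shots` trials."""
    rng = np.random.default_rng(rng)
    # Each bootstrap resample of Bernoulli shot outcomes is a binomial draw.
    samples = rng.binomial(shots, failures / shots, size=n_boot) / shots
    lo, hi = np.quantile(samples, [alpha / 2, 1 - alpha / 2])
    return failures / shots, (lo, hi)

def pseudo_threshold(p_values, wer_values, target=0.10):
    """Interpolate the erasure rate p* at which the (monotonically
    increasing) WER curve crosses `target`."""
    return float(np.interp(target, wer_values, p_values))

def extrapolate_threshold(sizes, pseudo_thresholds):
    """Toy finite-size extrapolation: fit p*(N) = p*_inf - c / sqrt(N)
    over code sizes N and return the intercept p*_inf (a guessed
    ansatz, not the paper's actual FSS form)."""
    x = 1.0 / np.sqrt(np.asarray(sizes, dtype=float))
    slope, intercept = np.polyfit(x, pseudo_thresholds, 1)
    return float(intercept)
```

With five (N, p*) pairs like those reported, the linear fit in $1/\sqrt{N}$ yields the asymptotic estimate directly as the intercept.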
Paper 2
ADaPT: Adaptive-window Decoding for Practical fault-Tolerance
Tina Oberoi, Joshua Viszlai, Frederic T. Chong
- Year: 2026
- Journal: arXiv preprint
- DOI: arXiv:2605.01149
- arXiv: 2605.01149
Window decoding, first proposed to reduce decoding complexity for real-time decoding, is an essential component of scalable, universal fault-tolerant computation. Prior work has focused on improving throughput through parallelization and reducing reaction time via speculation on window boundaries. However, these methods use a fixed window size d, paying a fixed decoding-time overhead for every window. In practice, we find this fixed overhead unnecessary in many cases because average-case errors in QEC are sparse. Leveraging this insight, we propose an adaptive window decoding technique based on decoder confidence. The technique reduces decoding-time overhead, and thus reaction time, without compromising logical error rates. We benchmark adaptive window decoding across different codes and hardware-inspired noise models. Our results show that the adaptive technique reaches the target logical error rate while maintaining low decoding-time overhead across codes and noise models.
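The idea of growing a decoding window only when the decoder lacks confidence can be sketched as a toy partitioning routine. This is a minimal sketch under stated assumptions: the paper's actual confidence metric is not specified in the abstract, so detector-flip density stands in for it here, and the function name, `Window` type, and threshold values are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Window:
    start: int  # first syndrome round in the window (inclusive)
    end: int    # one past the last round (exclusive)

def adaptive_windows(syndrome_rounds, min_size=3, max_size=9,
                     sparsity_cutoff=0.1):
    """Partition a stream of syndrome rounds into variable-size decoding
    windows. A window closes at `min_size` when its rounds are sparse
    (few flipped detectors, a stand-in 'confidence' signal); dense
    windows are extended up to `max_size` rounds."""
    windows, start = [], 0
    n = len(syndrome_rounds)
    while start < n:
        end = min(start + min_size, n)
        # Extend while the window looks 'hard': high detector-flip density.
        while end < min(start + max_size, n):
            density = sum(sum(r) for r in syndrome_rounds[start:end]) / (
                (end - start) * len(syndrome_rounds[0]))
            if density <= sparsity_cutoff:
                break
            end += 1
        windows.append(Window(start, end))
        start = end
    return windows
```

On a quiet syndrome stream this emits minimum-size windows (low latency); on a noisy stretch it widens the window before committing, which is the trade-off the adaptive scheme exploits.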