Compare Papers
Paper 1
Mitigating Depolarizing Noise on Quantum Computers with Noise-Estimation Circuits
Miroslav Urbanek, Benjamin Nachman, Vincent R. Pascuzzi, Andre He, Christian W. Bauer, Wibe A. de Jong
- Year
- 2021
- Journal
- Physical Review Letters
- DOI
- 10.1103/PhysRevLett.127.270502
- arXiv
- -
No abstract.
Paper 2
Tight Generalization Bound for Supervised Quantum Machine Learning
Xin Wang, Rebing Wu
- Year
- 2025
- Journal
- arXiv preprint
- DOI
- arXiv:2510.24348
- arXiv
- 2510.24348
We derive a tight generalization bound for quantum machine learning that is applicable to a wide range of supervised tasks, data, and models. Our bound is both efficiently computable and free of big-O notation. Furthermore, we point out that previous bounds relying on big-O notation may provide misleading suggestions regarding the generalization error. Our generalization bound demonstrates that for quantum machine learning models of arbitrary size and depth, the sample size is the most dominant factor governing the generalization error. Additionally, the spectral norm of the measurement observable, as well as the bound and Lipschitz constant of the selected risk function, also influence the generalization upper bound. However, the number of quantum gates, the number of qubits, the data encoding method, and hyperparameters chosen during the learning process, such as batch size, epochs, learning rate, and optimizer, do not significantly impact the generalization capability of quantum machine learning. We experimentally demonstrate the tightness of our generalization bound across classification and regression tasks. Furthermore, we show that our tight generalization upper bound holds even when labels are completely randomized. We thus bring clarity to the fundamental question of generalization in quantum machine learning.
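The abstract does not state the closed form of the bound, only the quantities it depends on: the sample size (dominant), the spectral norm of the measurement observable, and the bound and Lipschitz constant of the risk function. As a hedged illustration only, the sketch below evaluates a generic bound with that qualitative shape, `L * B * ||O|| / sqrt(n)` plus a confidence term; the functional form, the function name, and the constants are assumptions for illustration, not the paper's actual bound.

```python
import numpy as np

def generic_generalization_bound(n_samples, observable, risk_bound,
                                 lipschitz_const, confidence=0.05):
    """Illustrative bound sharing the dependencies the abstract lists.

    NOTE: This is NOT the bound derived in the paper. It is a generic
    placeholder of the shape L * B * ||O||_2 / sqrt(n) + sqrt(log(1/delta)/n),
    chosen only so that the sample size n dominates (1/sqrt(n) decay) and the
    spectral norm of the observable and the bound B and Lipschitz constant L
    of the risk function each scale the result, as the abstract describes.
    """
    # Spectral norm = largest singular value of the observable matrix.
    spectral_norm = np.linalg.norm(observable, ord=2)
    complexity_term = lipschitz_const * risk_bound * spectral_norm / np.sqrt(n_samples)
    confidence_term = np.sqrt(np.log(1.0 / confidence) / n_samples)
    return complexity_term + confidence_term

# Example: a single-qubit Pauli-Z observable has spectral norm 1.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
print(generic_generalization_bound(n_samples=1000, observable=Z,
                                   risk_bound=1.0, lipschitz_const=1.0))
```

Note how, consistent with the abstract's claim, nothing in this expression depends on circuit depth, gate count, or training hyperparameters; only the data and measurement quantities enter.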