Compare Papers

Paper 1

Characterizing large-scale quantum computers via cycle benchmarking.

Erhard A, Wallman JJ, Postler L, Meth M, Stricker R, Martinez EA, Schindler P, Monz T, Emerson J, Blatt R.

Year
2019
Journal
Nat Commun
DOI
10.1038/s41467-019-13068-7
arXiv
-

No abstract.


Paper 2

Computational Complexity of Learning Efficiently Generatable Pure States

Taiga Hiroka, Min-Hsiu Hsieh

Year
2024
Journal
arXiv preprint
DOI
10.48550/arXiv.2410.04373
arXiv
2410.04373

Understanding the computational complexity of learning efficient classical programs in various learning models has been a fundamental and important question in classical computational learning theory. In this work, we study the computational complexity of quantum state learning, which can be seen as a quantum generalization of distributional learning introduced by Kearns et al. [STOC94]. Previous works by Chung and Lin [TQC21], and Bădescu and O'Donnell [STOC21] study the sample complexity of quantum state learning and show that polynomially many copies are sufficient if the unknown quantum states are promised to be efficiently generatable. However, their algorithms are inefficient, and the computational complexity of this learning problem remains unresolved. In this work, we study the computational complexity of quantum state learning when the states are promised to be efficiently generatable. We show that if the unknown quantum states are promised to be pure and efficiently generatable, then there exists a quantum polynomial-time algorithm $A$ and a language $L \in PP$ such that $A^L$ can learn their classical description. We also observe a connection between the hardness of learning quantum states and quantum cryptography. We show that the existence of one-way state generators with pure state outputs is equivalent to the average-case hardness of learning pure states. Additionally, we show that the existence of EFI implies the average-case hardness of learning mixed states.
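The main learnability result in the abstract can be stated more formally. The following is a paraphrase in my own notation, not the authors' exact theorem statement; the class names $\mathrm{QPT}$ and $\mathrm{PP}$ and the error parameters are standard, but the precise success criterion used in the paper may differ.

```latex
% Paraphrased main result: pure, efficiently generatable states are
% learnable in quantum polynomial time relative to a PP oracle.
%
% Let |psi> be an n-qubit pure state prepared by some poly(n)-size
% quantum circuit C (the "efficiently generatable" promise).
\[
\exists\, A \in \mathrm{QPT},\ \exists\, L \in \mathrm{PP}
\ \text{such that, given } \mathrm{poly}(n,1/\varepsilon)
\ \text{copies of } |\psi\rangle,
\]
\[
A^{L} \ \text{outputs a classical description of a circuit } \hat{C}
\ \text{with } \bigl|\langle \psi | \hat{C} |0^n\rangle\bigr|^2
\ \ge 1 - \varepsilon
\]
% i.e., the learner's output circuit prepares a state with high
% fidelity to the unknown target, with the PP oracle absorbing the
% computationally hard part of the learning task.
```

The point of the oracle formulation is to separate sample efficiency (already known from Chung–Lin and Bădescu–O'Donnell) from computational efficiency: the algorithm itself runs in quantum polynomial time, and all remaining hardness is localized in the $\mathrm{PP}$ oracle queries.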
