Compare Papers

Paper 1

Pseudorandom Function from Learning Burnside Problem

Dhiraj K. Pandey, Antonio R. Nicolosi

Year
2025
Journal
Mathematics
DOI
10.3390/math13071193
arXiv
-

We present three progressively refined pseudorandom function (PRF) constructions based on the learning Burnside homomorphisms with noise (B_n-LHN) assumption. A key challenge in this approach is error management, which we address by extracting errors from the secret key. Our first design, a direct pseudorandom generator (PRG), leverages the lower entropy of the error set (E) compared to the Burnside group (B_r). The second, a parameterized PRG, derives its function description from public parameters and the secret key, aligning with the relaxed PRG requirements in the Goldreich–Goldwasser–Micali (GGM) PRF construction. The final indexed PRG introduces public parameters and an index to refine efficiency. To optimize computations in Burnside groups, we enhance concatenation operations and homomorphisms from B_n to B_r for n ≫ r. Additionally, we explore algorithmic improvements and parallel computation strategies to improve efficiency.
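The GGM construction referenced in the abstract turns any length-doubling PRG into a PRF by walking a binary tree: the key seeds the root, and each input bit selects the left or right half of the PRG output. A minimal sketch, using SHA-256 as a stand-in for a length-doubling PRG (the paper's Burnside-based PRGs are not reproduced here; the `prg` helper and the all-zero key are illustrative assumptions):

```python
import hashlib

def prg(seed: bytes) -> tuple[bytes, bytes]:
    """Stand-in length-doubling PRG: 32-byte seed -> two 32-byte halves.
    Illustrative only; the paper instead builds PRGs from B_n-LHN."""
    left = hashlib.sha256(seed + b"\x00").digest()
    right = hashlib.sha256(seed + b"\x01").digest()
    return left, right

def ggm_prf(key: bytes, x: str) -> bytes:
    """GGM PRF evaluation on bit-string input x:
    descend the binary tree, picking G_0 or G_1 per input bit."""
    state = key
    for bit in x:
        left, right = prg(state)
        state = left if bit == "0" else right
    return state

tag = ggm_prf(b"\x00" * 32, "0110")
```

Each evaluation costs one PRG call per input bit; the tree itself is never materialized, which is what makes the GGM template attractive once an efficient PRG (here, the paper's indexed Burnside PRG) is available.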


Paper 2

Quantum feature-map learning with reduced resource overhead

Jonas Jäger, Philipp Elsässer, Elham Torabian

Year
2025
Journal
arXiv preprint
DOI
-
arXiv
2510.03389

Current quantum computers require algorithms that use limited resources economically. In quantum machine learning, success hinges on quantum feature maps, which embed classical data into the state space of qubits. We introduce Quantum Feature-Map Learning via Analytic Iterative Reconstructions (Q-FLAIR), an algorithm that reduces quantum resource overhead in iterative feature-map circuit construction. It shifts workloads to a classical computer via partial analytic reconstructions of the quantum model, using only a few evaluations. For each probed gate addition to the ansatz, the simultaneous selection and optimization of the data feature and weight parameter is then entirely classical. Integrated into quantum neural network and quantum kernel support vector classifiers, Q-FLAIR shows state-of-the-art benchmark performance. Since resource overhead decouples from feature dimension, we train a quantum model on a real IBM device in only four hours, surpassing 90% accuracy on the full-resolution MNIST dataset (784 features, digits 3 vs 5). Such results were previously unattainable, as the feature dimension prohibitively drives hardware demands for fixed and search costs for adaptive ansätze. By rethinking feature-map learning beyond black-box optimization, this work takes a concrete step toward enabling quantum machine learning for real-world problems and near-term quantum computers.
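The partial analytic reconstruction at the core of Q-FLAIR can be illustrated in its simplest case: when a model's output depends sinusoidally on a single gate weight θ (as expectation values under one-parameter rotation gates do), three circuit evaluations suffice to recover f(θ) = a·cos θ + b·sin θ + c exactly, after which selecting the optimal θ is entirely classical. A minimal sketch under that sinusoidal assumption (function names and sample points are illustrative, not taken from the paper):

```python
import numpy as np

def reconstruct_and_minimize(f):
    """Recover f(theta) = a*cos(theta) + b*sin(theta) + c from three
    evaluations of the (quantum) model f, then minimize classically."""
    f0, f1, f2 = f(0.0), f(np.pi / 2), f(np.pi)
    a = (f0 - f2) / 2.0   # cosine amplitude
    c = (f0 + f2) / 2.0   # constant offset
    b = f1 - c            # sine amplitude
    # f(theta) = R*cos(theta - phi) + c with R = hypot(a, b),
    # phi = atan2(b, a); the minimum -R + c sits at theta = phi + pi.
    theta_star = np.arctan2(b, a) + np.pi
    f_min = -np.hypot(a, b) + c
    return theta_star, f_min

# Hidden sinusoidal model standing in for a circuit evaluation.
hidden = lambda t: 2.0 * np.cos(t) - 1.5 * np.sin(t) + 0.3
theta_star, f_min = reconstruct_and_minimize(hidden)
```

Because only a fixed, small number of evaluations is needed per probed gate regardless of the data dimension, this is consistent with the abstract's point that resource overhead decouples from the feature dimension.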
