Compare Papers

Paper 1

Convolutional neural network based decoders for surface codes

Simone Bordoni, Stefano Giagu

Year
2023
Journal
arXiv preprint
DOI
10.48550/arXiv.2312.03508
arXiv
2312.03508

The decoding of error syndromes of surface codes with classical algorithms may slow down quantum computation. To overcome this problem, it is possible to implement decoding algorithms based on artificial neural networks. This work reports a study of decoders based on convolutional neural networks, tested on different code distances and noise models. The results show that decoders based on convolutional neural networks perform well and can adapt to different noise models. Moreover, explainable machine learning techniques have been applied to the neural network of the decoder to better understand the behaviour and errors of the algorithm, in order to produce a more robust and better-performing algorithm.
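A convolutional decoder treats the 2D lattice of stabilizer measurements as an image-like input. The following is a minimal sketch of that idea in plain NumPy; the grid size, the hand-set kernel, and the defect placement are illustrative assumptions, not the architecture from the paper (which uses trained CNNs):

```python
import numpy as np

def conv2d(grid, kernel):
    """Valid-mode 2D cross-correlation of a syndrome grid with a kernel."""
    kh, kw = kernel.shape
    gh, gw = grid.shape
    out = np.zeros((gh - kh + 1, gw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(grid[i:i + kh, j:j + kw] * kernel)
    return out

# Toy syndrome grid: 1 marks a flipped stabilizer measurement (a "defect").
syndrome = np.zeros((4, 4))
syndrome[1, 1] = syndrome[1, 2] = 1.0  # a defect pair created by one data-qubit error

# A hand-set 2x2 filter; a trained CNN would learn many such filters from data.
kernel = np.ones((2, 2))
feature_map = conv2d(syndrome, kernel)

# The strongest activation is where the window covers both defects at once.
peak = np.unravel_index(np.argmax(feature_map), feature_map.shape)
print(peak)
```

The locality of the convolution is what lets the same learned filters generalise across code distances, since error patterns produce the same local defect signatures regardless of lattice size.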


Paper 2

Communication-efficient Quantum Algorithm for Distributed Machine Learning

Hao Tang, Boning Li, Guoqing Wang, Haowei Xu, Changhao Li, Ariel Barr, Paola Cappellaro, Ju Li

Year
2022
Journal
arXiv preprint
DOI
10.48550/arXiv.2209.04888
arXiv
2209.04888

The growing demands of remote detection and the increasing amount of training data make distributed machine learning under communication constraints a critical issue. This work provides a communication-efficient quantum algorithm that tackles two traditional machine learning problems, least-squares fitting and softmax regression, in the scenario where the data set is distributed across two parties. Our quantum algorithm finds the model parameters with a communication complexity of $O(\log_2(N)/\epsilon)$, where $N$ is the number of data points and $\epsilon$ is the bound on parameter errors. Compared to classical algorithms and other quantum algorithms that achieve the same output task, our algorithm provides a communication advantage in the scaling with the data volume. The building block of our algorithm, the quantum-accelerated estimation of distributed inner product and Hamming distance, could be further applied to various tasks in distributed machine learning to accelerate communication.
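To see what the $O(\log_2(N)/\epsilon)$ scaling buys, here is a back-of-the-envelope comparison against a baseline whose communication grows linearly in $N$ (e.g. shipping the raw data). The unit costs and the choice of $\epsilon$ below are illustrative assumptions, not figures from the paper:

```python
import math

def quantum_comm_cost(n_points, epsilon):
    """Communication scaling as log2(N)/epsilon, constant factors omitted."""
    return math.log2(n_points) / epsilon

def classical_comm_cost(n_points):
    """A linear-in-N baseline for transmitting the data itself, constants omitted."""
    return float(n_points)

epsilon = 0.01  # assumed error bound on the fitted parameters
for n in (10**3, 10**6, 10**9):
    q = quantum_comm_cost(n, epsilon)
    c = classical_comm_cost(n)
    print(f"N={n:>10}: quantum ~ {q:8.1f}, classical ~ {c:.0f}, ratio {c / q:.1e}")
```

The gap widens as $N$ grows: the quantum cost only triples going from $N = 10^3$ to $N = 10^9$, while the linear baseline grows by six orders of magnitude, which is the sense in which the advantage is in the scaling with data volume rather than in constant factors.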
