2023
DOI: 10.3390/app13179689
Convolutional-Neural-Network-Based Hexagonal Quantum Error Correction Decoder

Aoqing Li,
Fan Li,
Qidi Gan
et al.

Abstract: Topological quantum error-correcting codes are an important tool for realizing fault-tolerant quantum computers. The heavy hexagonal code is a new class of quantum error-correcting code that assigns physical and auxiliary qubits to the vertices and edges of a low-degree graph. The layout of heavy hexagonal codes is particularly well suited to superconducting qubit architectures, reducing frequency conflicts and crosstalk. Although various topological code decoders have been proposed, constructing the optimal dec…

Cited by 4 publications (1 citation statement)
References 50 publications
“…Among them, various neural networks with different decoding methods have been proposed, [27][28][29][30] such as feedforward neural networks (FFNN), recurrent neural networks (RNN), and convolutional neural networks (CNN). Decoding based on neural networks [31][32][33] shortens the time required for decoding and can achieve performance similar to that of classical decoding algorithms. However, as the code distance increases, the number of samples required for neural network training increases exponentially.…”
Section: Introduction
confidence: 99%
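The final point of the quoted statement, that training-sample requirements grow exponentially with code distance, can be made concrete with a short sketch. The d² − 1 stabilizer count used below is for the rotated surface code and is an illustrative assumption; the heavy hexagonal layout studied in the paper has a different qubit count, but the exponential trend in the syndrome space is the same.

```python
# Illustrative sketch: the syndrome space a neural decoder must learn from
# grows exponentially with code distance. The rotated surface code (assumed
# here for concreteness) measures d*d - 1 stabilizers per round, each giving
# a binary outcome, so there are 2**(d*d - 1) possible syndrome patterns.

def syndrome_space_size(d: int) -> int:
    """Number of distinct single-round syndrome patterns at distance d."""
    return 2 ** (d * d - 1)

for d in (3, 5, 7, 9):
    print(f"d={d}: {syndrome_space_size(d):,} possible syndromes")
```

Even at d = 7 there are already 2⁴⁸ ≈ 2.8 × 10¹⁴ possible patterns, which is why training-set size, rather than inference speed, becomes the practical bottleneck for neural-network decoders.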