Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing 2018
DOI: 10.1145/3188745.3188886
Efficient decoding of random errors for quantum expander codes

Abstract: We show that quantum expander codes, a constant-rate family of quantum LDPC codes, with the quasi-linear time decoding algorithm of Leverrier, Tillich and Zémor can correct a constant fraction of random errors with very high probability. This is the first construction of a constant-rate quantum LDPC code with an efficient decoding algorithm that can correct a linear number of random errors with a negligible failure probability. Finding codes with these properties is also motivated by Gottesman's construction o…
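The decoder referenced in the abstract is the small-set-flip algorithm of Leverrier, Tillich and Zémor, which repeatedly flips small sets of qubits to reduce the syndrome weight. As a loose classical analogue only (this is an illustrative assumption, not the paper's algorithm), a single-bit greedy syndrome-flip decoder over GF(2) can be sketched as follows:

```python
import numpy as np

def greedy_flip_decode(H, syndrome, max_iters=100):
    """Toy syndrome decoder: repeatedly flip the single bit that most
    reduces the number of unsatisfied checks. H is a binary parity-check
    matrix; returns (error estimate, residual syndrome)."""
    s = syndrome.copy() % 2
    est = np.zeros(H.shape[1], dtype=int)
    for _ in range(max_iters):
        if not s.any():
            break  # syndrome cleared: decoding succeeded
        # For bit j, gain = (#unsatisfied checks on j) - (#satisfied checks on j)
        gains = H.T @ (2 * s - 1)
        j = int(np.argmax(gains))
        if gains[j] <= 0:
            break  # stuck: no single flip reduces the syndrome weight
        est[j] ^= 1
        s = (s + H[:, j]) % 2  # flipping bit j toggles its checks
    return est, s

# Example: 4 bits, 3 checks; an error on bit 1 gives syndrome (1, 1, 0).
H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]])
est, residual = greedy_flip_decode(H, np.array([1, 1, 0]))
print(est, residual)  # [0 1 0 0] [0 0 0]
```

The actual quantum decoder flips small subsets within a stabilizer generator's support rather than single bits, which is what makes the expansion-based analysis go through.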

Cited by 35 publications (56 citation statements)
References 26 publications
“…Another limitation of our work is the very small threshold value that it yields. While the threshold is usually expected to lie between 10⁻³ and 10⁻² for the best constructions based on code concatenation, we expect our value to be several orders of magnitude smaller, as this was already the case in Gottesman's paper [14] and in our previous work with perfect syndrome measurement [8]. Part of the explanation is due to the very crude bounds that we obtain via percolation theory arguments.…”
Section: Introduction
confidence: 57%
“…In order to analyze random errors of linear weight, we show using percolation theory that, with high probability, the error forms clusters in the sense of connected α-subsets (Definition 20 and Lemma 27). This is similar to the analysis in [17,8], except that we use the syndrome adjacency graph of the code (as in [14]) to establish the "locality" of the decoding algorithm, implying that each cluster of the error is corrected independently of the other ones (Lemma 19). Using the fact that clusters are of size bounded by the minimum distance of the code, the result on low weight errors shows that the size of E ⊕ Ê is controlled by the syndrome error size.…”
Section: Main Results and Proof Techniques
confidence: 79%
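The clustering step quoted above amounts to splitting the error support into connected components of the syndrome adjacency graph, so that a local decoder can handle each component in isolation. A minimal sketch of that decomposition (hypothetical illustration, not the cited authors' code; the graph and qubit labels are invented):

```python
from collections import deque

def error_clusters(error_qubits, adjacency):
    """Group erred qubits into connected components of the syndrome
    adjacency graph restricted to the error support. `adjacency` maps
    each qubit to the set of qubits that share a check with it."""
    remaining = set(error_qubits)
    clusters = []
    while remaining:
        seed = remaining.pop()
        component, frontier = {seed}, deque([seed])
        while frontier:  # breadth-first search within the error support
            q = frontier.popleft()
            for nb in adjacency.get(q, ()):
                if nb in remaining:
                    remaining.discard(nb)
                    component.add(nb)
                    frontier.append(nb)
        clusters.append(component)
    return clusters

# Toy graph: qubits 0 and 2 share a check; qubit 5 is isolated.
adj = {0: {2}, 2: {0, 3}, 3: {2}, 5: set()}
print(sorted(map(sorted, error_clusters({0, 2, 5}, adj))))  # [[0, 2], [5]]
```

In the percolation argument, a sub-threshold random error makes every such component small (below the minimum distance) with high probability, which is what lets each cluster be corrected independently.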
“…But near-optimal (or very fast suboptimal) decoding algorithms have already been proposed for these codes [28][29][30][31], which exploit their regular lattice structure. In contrast, for quantum LDPC codes, which are defined on random graphs, only recently has a decoding algorithm been found for the special family of expander codes [7,32,33], and the general case remains open. Our main motivation to study this problem is that quantum LDPC codes have the potential of greatly reducing the overhead required to realize robust quantum processors [34,35].…”
confidence: 99%