2017 IEEE Information Theory Workshop (ITW)
DOI: 10.1109/itw.2017.8278028

On the weight hierarchy of locally repairable codes

Abstract: An (n, k, r) locally repairable code (LRC) is an [n, k, d] linear code where every code symbol can be repaired from at most r other code symbols. An LRC is said to be optimal if the minimum distance attains the Singleton-like bound d ≤ n − k − ⌈k/r⌉ + 2. The generalized Hamming weights (GHWs) of linear codes are fundamental parameters which have many useful applications. Generally it is difficult to determine the GHWs of linear codes. In this paper, we study the GHWs of LRCs. Firstly, we obtain a generalized Single…
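To make the Singleton-like bound in the abstract concrete, here is a minimal Python sketch (my own illustration, not from the paper; the function names are hypothetical) that computes the bound and tests whether a given minimum distance attains it, i.e., whether the LRC is optimal in the above sense.

```python
from math import ceil

def singleton_like_bound(n: int, k: int, r: int) -> int:
    """Upper bound on the minimum distance d of an (n, k, r) LRC:
    d <= n - k - ceil(k/r) + 2 (the Singleton-like bound quoted in the abstract)."""
    return n - k - ceil(k / r) + 2

def is_optimal_lrc(n: int, k: int, r: int, d: int) -> bool:
    """An (n, k, r) LRC is called optimal if its minimum distance attains the bound."""
    return d == singleton_like_bound(n, k, r)

# Example: n = 15, k = 8, locality r = 4 gives d <= 15 - 8 - 2 + 2 = 7.
print(singleton_like_bound(15, 8, 4))   # 7
print(is_optimal_lrc(15, 8, 4, d=7))    # True
```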

Cited by 7 publications (6 citation statements). References 25 publications.
“…Therefore, defining the largest possible minimum distance of an [n′, k′]_q linear code by d_q^(opt)(n′, k′), we have the following result [17].…”
Section: Proposition 1 ([15]) (mentioning)
Confidence: 99%
“…In this paper, we generalize the results of [1] to study the GHWs of this more general class of q-ary (n, k, r, δ)-LRCs with δ ≥ 2 and their dual codes. The upper bounds and lower bounds on GHWs of (n, k, r)-LRCs in [1] can be obtained by the results in this paper when δ = 2. The results of this paper are summarized as follows.…”
Section: Introduction (mentioning)
Confidence: 89%
“…When r = k, the above bound reduces to the classical Singleton bound d ≤ n − k + 1. An (n, k, r)-LRC is said to be optimal if its minimum distance attains the bound (1).…”
Section: Introduction (mentioning)
Confidence: 99%
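As a quick verification of the reduction stated in the quoted passage above (my own check, not part of the citing paper): when r = k the ceiling term equals 1, so the Singleton-like bound collapses to the classical Singleton bound.

```latex
r = k \;\Longrightarrow\; \left\lceil \frac{k}{r} \right\rceil = 1
\;\Longrightarrow\; d \le n - k - 1 + 2 = n - k + 1 .
```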