2011 IEEE International Symposium on Information Theory Proceedings
DOI: 10.1109/isit.2011.6033714

Localized dimension growth in random network coding: A convolutional approach

Abstract: We propose an efficient Adaptive Random Convolutional Network Coding (ARCNC) algorithm to address the issue of field size in random network coding. ARCNC operates as a convolutional code, with the coefficients of local encoding kernels chosen randomly over a small finite field. The lengths of local encoding kernels increase with time until the global encoding kernel matrices at related sink nodes all have full rank. Instead of estimating the necessary field size a priori, ARCNC operates in a small finite field…
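As a rough sketch of the stopping rule described above (not the authors' implementation; the field size Q, the dimension M, and the modeling of kernel growth as one fresh random coefficient matrix per time step are illustrative assumptions), the following Python snippet grows a sink's global encoding kernel matrix $F_r(z) = \sum_t F_t z^t$ by one random $M \times M$ coefficient matrix per step and stops once $\det F_r(z)$ is a nonzero polynomial, i.e. once the matrix has full rank:

```python
import random

Q = 5   # toy field size, assumed prime so that arithmetic mod Q forms a field
M = 3   # number of source symbols per generation (min-cut to the sink)

def poly_add(a, b):
    """Add two polynomials over F_Q, stored as coefficient lists (lowest degree first)."""
    n = max(len(a), len(b))
    return [((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % Q
            for i in range(n)]

def poly_mul(a, b):
    """Multiply two polynomials over F_Q."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % Q
    return out

def poly_det(mat):
    """Determinant of a matrix of polynomials by cofactor expansion (fine for small M)."""
    if len(mat) == 1:
        return mat[0][0]
    det, sign = [0], 1
    for col in range(len(mat)):
        minor = [row[:col] + row[col + 1:] for row in mat[1:]]
        term = poly_mul(mat[0][col], poly_det(minor))
        if sign < 0:
            term = [(-c) % Q for c in term]
        det = poly_add(det, term)
        sign = -sign
    return det

def full_rank(mat):
    """F_r(z) has full rank iff det F_r(z) is not the zero polynomial."""
    return any(c != 0 for c in poly_det(mat))

# Adaptive growth: at each time step every entry of F_r(z) gains one more random
# coefficient from F_Q (a stand-in for lengthening the local encoding kernels by one tap).
F = [[[] for _ in range(M)] for _ in range(M)]
t = 0
while True:
    for i in range(M):
        for j in range(M):
            F[i][j].append(random.randrange(Q))
    if full_rank(F):
        break
    t += 1

print(f"sink decodable after kernel length {t + 1} over GF({Q})")
```

Because the coefficients are drawn from a small field, the first few checks can fail; the point of the adaptive scheme is that kernel length, and hence decoding delay and memory, grows only where and when it is needed.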

Cited by 9 publications (12 citation statements); References 13 publications.

Citation statements, ordered by relevance:
“…Assume $s$ generates a source message per unit time, consisting of a fixed number of $m$ source symbols represented by a size-$m$ row vector $x_t = (x_{1,t}, x_{2,t}, \cdots, x_{m,t})$, $x_{i,t} \in \mathbb{F}_q$. Time $t$ is indexed from 0, with the $(t+1)$-th message generated at time $t$, consistent with our previous work [1], [10]. Source messages are collectively represented by a power series $x(z) = \sum_{t \ge 0} x_t z^t$, where $x_t$ is the message generated at time $t$ and $z$ denotes a unit-time delay.…”
Section: A. Basic Model and Definitions
Mentioning confidence: 69%
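To make the power-series notation concrete, here is a toy Python example (my own, not taken from either paper) of how one edge's symbol stream is the discrete convolution of the message stream $x(z) = \sum_{t \ge 0} x_t z^t$ with that edge's global encoding kernel, over a prime field of size Q:

```python
Q = 2  # toy field size, assumed prime (arithmetic is mod Q)

# Source messages x_t, one length-m row vector per unit time (m = 2 here):
# x(z) = x_0 + x_1 z + x_2 z^2 + ...
x = [[1, 0], [0, 1], [1, 1]]

# Global encoding kernel of one edge e, a length-m vector of polynomials over F_Q
# stored as coefficient lists: here f_e(z) = (1 + z, z).
f_e = [[1, 1], [0, 1]]   # f_e[i][k] = coefficient of z^k in the i-th entry

def edge_symbol(t):
    """y_{e,t} = sum_{k<=t} x_{t-k} . f_{e,k}: the symbol carried by edge e at time t
    is a convolution of past messages with the kernel coefficients, in F_Q."""
    total = 0
    for k in range(t + 1):
        if t - k >= len(x):       # message x_{t-k} not generated in this toy run
            continue
        for i, entry in enumerate(f_e):
            coeff = entry[k] if k < len(entry) else 0
            total += x[t - k][i] * coeff
    return total % Q

print([edge_symbol(t) for t in range(5)])   # the first few symbols of y_e(z)
```

Each received symbol at time $t$ mixes $x_0, \ldots, x_t$, which is why decodability is phrased in terms of the polynomial matrix $F_r(z)$ collecting the kernels of a sink's incoming edges, rather than a single matrix over $\mathbb{F}_q$.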
“…2 illustrates its topology. Assuming unit-capacity links, the min-cut to each sink is $m$. In combination networks, routing is insufficient and network coding is needed to achieve the multicast capacity $m$. Here coding is performed only at $s$, since each intermediate node but $s$ has only one incoming edge. For a general $\binom{n}{m}$ combination network, we showed in [1] that the expected average first decoding time can be significantly improved by ARCNC when compared to BNC. At time $t-1$, for a sink $r$, $F_r(z)$ is an $m \times m$ matrix of polynomials of degree $t-1$.…”
Section: A Combination Network
Mentioning confidence: 95%
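The comparison with BNC quoted above can be illustrated with a toy experiment (not the one reported in [1]; the parameters and helper names below are made up): with plain block coding over a small field, one round of random coding vectors at $s$ typically leaves some of the $\binom{n}{m}$ sinks with a singular $m \times m$ matrix, and those sinks must wait for later rounds, which is exactly the effect the adaptive convolutional growth targets.

```python
import itertools
import random

Q, N, M = 2, 6, 3   # toy parameters: field size (assumed prime), n relays, min-cut m

def rank_mod_q(rows):
    """Rank over F_Q by Gaussian elimination mod Q (Q prime assumed)."""
    mat = [row[:] for row in rows]
    rank = 0
    for c in range(len(mat[0])):
        pivot = next((r for r in range(rank, len(mat)) if mat[r][c] != 0), None)
        if pivot is None:
            continue
        mat[rank], mat[pivot] = mat[pivot], mat[rank]
        inv = pow(mat[rank][c], Q - 2, Q)            # inverse via Fermat's little theorem
        mat[rank] = [(v * inv) % Q for v in mat[rank]]
        for r in range(len(mat)):
            if r != rank and mat[r][c]:
                factor = mat[r][c]
                mat[r] = [(a - factor * b) % Q for a, b in zip(mat[r], mat[rank])]
        rank += 1
    return rank

# One round of block coding at the source: each of the N outgoing edges of s
# carries a random length-M coding vector over F_Q.
vectors = [[random.randrange(Q) for _ in range(M)] for _ in range(N)]

# Each choice of M relays corresponds to one sink; it decodes immediately
# iff its M x M matrix of coding vectors has full rank over F_Q.
sinks = list(itertools.combinations(range(N), M))
decodable = sum(rank_mod_q([vectors[i] for i in sink]) == M for sink in sinks)
print(f"{decodable}/{len(sinks)} sinks decode from the first round over GF({Q})")
```

Over GF(2) with $n = 6$, $m = 3$, a noticeable fraction of the 20 sinks typically fails on the first round, while a larger $q$ makes such failures rare; ARCNC instead keeps $q$ small and lets the unlucky sinks wait a few extra time steps.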
“…This has been adopted in the adaptive random construction algorithm for a CNC in [11]. It is a necessary but not sufficient condition for decoding with delay $L$ at sink $r$.…”
Section: Theorem
Mentioning confidence: 99%
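For context, one common way the convolutional network coding literature formalizes delay-$L$ decoding (a paraphrase from memory, not a quote from [11]) is the existence of a decoding matrix $D_r(z)$ with entries in $\mathbb{F}_q[[z]]$ such that
\[
F_r(z)\, D_r(z) = z^{L} I_m .
\]
Full rank of $F_r(z)$, i.e. $\det F_r(z) \neq 0$, guarantees such a $D_r(z)$ exists once $L$ is at least the multiplicity of $z = 0$ as a root of $\det F_r(z)$, but not for an arbitrarily prescribed $L$, which is consistent with the necessary-but-not-sufficient remark above.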
“…Efficient centralized construction of optimal convolutional network codes (CNCs) on a multicast network is discussed in [4], [6], [8], [7]. A distributed construction scheme for a CNC is proposed in [11] by adaptively and randomly assigning local encoding kernels.…”
Section: Introduction
Mentioning confidence: 99%