2001
DOI: 10.1109/18.910574

Dynamic programming and the graphical representation of error-correcting codes

Abstract: Graphical representations of codes facilitate the design of computationally efficient decoding algorithms. This is an example of a general connection between dependency graphs, as arise in the representations of Markov random fields, and the dynamic programming principle. We concentrate on two computational tasks: finding the maximum-likelihood codeword and finding its posterior probability, given a signal received through a noisy channel. These two computations lend themselves to a particularly elegant…

Cited by 30 publications (24 citation statements)
References 40 publications
“…Successful examples of object recognition include mine detection by Raphael [26] and recognition of mathematical formulas by Chou [27]. There is an application to error-correcting codes in which the information is embedded in each three nodes in a tree-structured MRF [28].…”
Section: Discussion
confidence: 99%
“…For example, the most widely used decoding technique is the Viterbi algorithm [29]. Its computation of the most likely configuration can be achieved within O(n|S|^2) operations [12]. Now several of its variant algorithms are also popular for different purposes, such as the posterior-Viterbi algorithm.…”
Section: δS(t + 1) = F S (T) ϑ
confidence: 99%
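The O(n|S|^2) bound quoted above comes from the Viterbi recursion itself: at each of the n time steps, every state examines all |S| predecessors. A minimal sketch of that recursion on a toy two-state HMM (the states, observations, and probability tables here are invented for illustration, not taken from the cited papers):

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for an observation sequence."""
    # V[t][s] = max log-probability of any state path ending in state s at time t
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # O(|S|) work per state, so O(|S|^2) per time step and O(n|S|^2) overall
            best_prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][best_prev]
                       + math.log(trans_p[best_prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = best_prev
    # Trace the best final state back to the start
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Toy two-state HMM (hypothetical numbers, chosen only to exercise the recursion)
states = ("A", "B")
start_p = {"A": 0.9, "B": 0.1}
trans_p = {"A": {"A": 0.9, "B": 0.1}, "B": {"A": 0.1, "B": 0.9}}
emit_p = {"A": {"x": 0.8, "y": 0.2}, "B": {"x": 0.2, "y": 0.8}}

print(viterbi(["x", "y", "y"], states, start_p, trans_p, emit_p))  # ['A', 'B', 'B']
```

Working in log space, as above, avoids underflow on long sequences; the same recursion with max replaced by summation yields the posterior computations discussed in the abstract.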
“…This corresponds to a very simple linear graphic structure sequentially linking S(1) through S(n). See Geman and Kochanek [12] for an example of an HMM represented by a more complex graph.…”
Section: Heuristic Ideas, Event Aggregating Patterns, and Maximum Entropy
confidence: 99%