2010
DOI: 10.1007/s11265-009-0441-5
Tracking Forecast Memories for Stochastic Decoding

Abstract: This paper proposes Tracking Forecast Memories (TFMs) as a novel method for implementing re-randomization and de-correlation of stochastic bit streams in stochastic channel decoders. We show that TFMs are able to achieve decoding performance similar to that of the previous re-randomization methods in the literature (i.e., edge memories), but they exhibit much lower hardware complexity. We then present circuit topologies for analog implementation of TFMs.
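As a rough illustrative sketch of the idea (not the paper's exact circuit, which is presented for analog implementation), a TFM can be viewed as tracking the running mean of a regenerative bit stream with an exponentially weighted update, then regenerating fresh output bits by comparing that estimate against a random number, which breaks correlation while preserving the stream's probability. The function name, the update weight `beta`, and the test stream below are illustrative assumptions:

```python
import random

def tfm_regenerate(bits, beta=1 / 16, seed=0):
    """Illustrative TFM-style re-randomization: track the stream's mean
    with an exponentially weighted update, then emit fresh bits whose
    probability matches the tracked estimate (de-correlating the stream)."""
    rng = random.Random(seed)
    p = 0.5  # tracked probability estimate, initialized to the midpoint
    out = []
    for b in bits:
        p = (1 - beta) * p + beta * b      # exponential tracking of the mean
        out.append(1 if rng.random() < p else 0)  # regenerate a fresh bit
    return out, p

# A highly correlated periodic stream encoding probability ~0.75
src = [1, 1, 1, 0] * 500
regen, estimate = tfm_regenerate(src)
```

After the tracker converges, `estimate` sits near 0.75 and the regenerated stream carries the same probability with fresh randomness, which is the re-randomization role that edge memories previously served.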

Cited by 11 publications (5 citation statements)
References 20 publications
“…Stochastic decoders also require channel SNR estimation, so they share this fixed complexity cost with NGDBF. The most efficient stochastic decoding strategy is the Tracking Forecast Memory (TFM) described by Tehrani et al [18]. The TFM-based decoder requires 2d c − 1 XOR operations at each parity-check node, nearly twice the number of XNOR operations needed by GDBF algorithms.…”
Section: Comparison With Previous GDBF Algorithms
confidence: 99%
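The 2d_c − 1 count quoted above follows from how a stochastic parity-check node operates: each clock cycle, the extrinsic output bit toward edge i is the XOR of the other d_c − 1 incoming bits, which can be computed by first XOR-ing all d_c inputs (d_c − 1 operations) and then cancelling each edge's own bit (d_c more operations), for 2d_c − 1 in total. A minimal sketch with illustrative names:

```python
def check_node_outputs(in_bits):
    """Stochastic parity-check node: the output toward edge i is the
    XOR of all incoming bits except bit i (the extrinsic rule)."""
    total = 0
    for b in in_bits:          # d_c - 1 effective XORs to form the total
        total ^= b
    # XOR is its own inverse, so removing edge i's bit costs one more
    # XOR per edge: d_c further operations, 2*d_c - 1 overall.
    return [total ^ b for b in in_bits]

outs = check_node_outputs([1, 1, 0, 1])  # total parity is odd here
```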
“…The results shown in Figs. 18 and 19 represent single cases, and only partially demonstrate the superior convergence of NGDBF algorithms. To gain more insight into the convergence properties, we performed statistical analysis on the objective functions of several GDBF and NGDBF algorithms over many frames.…”
Section: Convergence Analysis
confidence: 99%
“…In [14], tracking forecast memories (TFMs) are introduced to extract the mean of regenerative bits, as a replacement for edge-memories. They have the advantage of a lower hardware complexity, but do not improve the performance of the decoder, which remains approximately 0.5 dB away from 32 iterations floating-point SPA.…”
Section: B. Stochastic Decoding
confidence: 99%
“…As was noted in [6] and [14], it is likely that the gap between the performance of floating-point SPA and of stochastic decoding is due to the variable node implementation. The RHS algorithm improves the accuracy of the variable node operation by extracting the information in the stochastic stream and performing exact operations in the LLR domain.…”
Section: A. Algorithm
confidence: 99%