[Proceedings 1992] IJCNN International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.1992.227326

Error correcting neural networks for channels with Gaussian noise

Abstract: The adaptability and parallel computing capabilities of neural networks make them especially suitable for error-correcting tasks. Feedforward neural networks for soft-decision decoding of block codes in channels with additive white Gaussian noise are presented. When the noise is not white, we deduce the optimal set of weights for the connections of the network. These weights are also approximately obtained by an error back-propagation algorithm. A practical realization for a BCH (7,4) code is presented, and exha…
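As a hedged illustration of the decoder the abstract describes (the paper's exact architecture is not shown in the truncated text above): under AWGN, maximum-likelihood soft-decision decoding selects the codeword with the largest correlation to the received vector, and a single-layer feedforward network implements this with one output neuron per codeword whose weights are that codeword in bipolar form. The (7,4) generator matrix below is one common systematic choice for the Hamming/BCH (7,4) code, not necessarily the one used in the paper.

```python
import numpy as np

# Hedged sketch (not the paper's exact realization): under AWGN, maximum-
# likelihood soft-decision decoding of a block code picks the codeword with
# the largest correlation to the received vector.  A single-layer feedforward
# network implements this directly: one output neuron per codeword, with the
# neuron's weights equal to that codeword in bipolar (+1/-1) form.

# One common systematic generator matrix for the (7,4) Hamming/BCH code
# (assumed here; the paper's specific generator is not given in the excerpt).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

messages = np.array([[(i >> k) & 1 for k in range(4)] for i in range(16)])
codewords = messages @ G % 2            # all 16 binary codewords
W = 1.0 - 2.0 * codewords               # network weights = bipolar codewords

def decode(received):
    """Soft-decision decode a length-7 real-valued received vector."""
    scores = W @ received               # correlation computed by each neuron
    return messages[np.argmax(scores)]  # winner-take-all output stage

# Example: all-zero message, BPSK mapping 0 -> +1, over an AWGN channel.
rng = np.random.default_rng(0)
tx = 1.0 - 2.0 * (np.zeros(4, dtype=int) @ G % 2)
rx = tx + 0.5 * rng.standard_normal(7)
print(decode(rx))                       # typically recovers [0 0 0 0]
```

For non-white Gaussian noise, the correlation would additionally involve the inverse noise covariance (a whitening step); that is the case for which the abstract says the optimal weights are derived, and it is not sketched here.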

Cited by 15 publications (15 citation statements) | References 4 publications

Citation statements (ordered by relevance):
“…Researchers have applied ML to the physical layer for modulation recognition [14], [15], channel modeling and identification [16], [17], encoding and decoding [18], [19], channel estimation [20], and equalization [21], [22] (see further details in [23] and [24]); however, ML has been unused commercially because handling physical channels is a complex process, and conventional ML algorithms have limited learning capacity. Researchers believe that ML can achieve further performance improvements by introducing DL to the physical layer.…”
Section: Introduction (mentioning)
confidence: 99%
“…Initially applied to upper layers [4], ML has recently found applications also at the PHY Layer [5]- [7] such as channel coding [8]- [10], modulation recognition [7], obstacle detection [11], [12] and physical layer security [13] etc. The use of ML for channel estimation was initially investigated in works such as [14].…”
Section: A Related State of the Art (mentioning)
confidence: 99%
“…Each sidelink subframe (1 ms) contains 14 OFDM symbols out of which 10 are used for carrying user data and the remaining 4 (at positions [2, 5, 8, 11] with 0-based indexing) are used for carrying DMRS symbols. The DMRS symbols are sequences $r^{(\alpha)}_{u,v}$ that are obtained by a cyclic shift of a base sequence $r_{u,v}(n)$ according to…”
Section: Channel Estimation in C-V2X (mentioning)
confidence: 99%
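The quoted excerpt ends before the defining equation. As a hedged sketch, assuming the usual LTE/C-V2X convention from 3GPP TS 36.211, the cyclic shift $\alpha$ acts as a per-subcarrier phase ramp, $r^{(\alpha)}_{u,v}(n) = e^{j\alpha n}\, r_{u,v}(n)$; the base sequence below is a random unit-modulus placeholder, not the sequence defined in the standard.

```python
import numpy as np

# Hedged sketch of the cyclic shift applied to a DMRS base sequence.  It
# assumes the usual LTE/C-V2X convention (3GPP TS 36.211), where the cyclic
# shift alpha appears as a per-subcarrier phase ramp:
#     r_{u,v}^{(alpha)}(n) = exp(j * alpha * n) * r_{u,v}(n).
# The base sequence below is a random unit-modulus placeholder, NOT the
# sequence defined in the standard.

def cyclic_shift(base_seq, alpha):
    """Apply cyclic shift alpha (radians per sample) to a base sequence."""
    n = np.arange(len(base_seq))
    return np.exp(1j * alpha * n) * base_seq

rng = np.random.default_rng(1)
base = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, size=12))  # 1 PRB, 12 REs

alpha = 2 * np.pi * 3 / 12        # hypothetical cyclic-shift index of 3
dmrs = cyclic_shift(base, alpha)
print(np.round(dmrs, 3))
```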
“…At the same time, deep learning has also become a research hotspot and has been applied to data analysis, speech recognition, image recognition, and other fields. The fitting and expressive abilities of neural networks are potent, and deep learning is suitable for HDPC decoding [1,2].…”
Section: Introduction (mentioning)
confidence: 99%
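Relating this citing work back to the paper: the abstract notes that the optimal decoder weights can also be obtained approximately by error back-propagation. The sketch below is a hypothetical minimal example, not the method of either paper; it trains a small dense network by back-propagation to decode noisy BPSK-modulated (7,4) codewords, whereas the cited HDPC decoders target much longer codes and use more elaborate architectures.

```python
import numpy as np
import torch
import torch.nn as nn

# Hypothetical minimal example (not the method of either paper): training a
# small dense decoder by back-propagation on noisy BPSK-modulated codewords.
# A (7,4) Hamming code stands in for the much longer HDPC codes targeted by
# the works cited above.

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)
messages = np.array([[(i >> k) & 1 for k in range(4)] for i in range(16)])
codewords = torch.tensor(1.0 - 2.0 * (messages @ G % 2), dtype=torch.float32)
labels = torch.tensor(messages, dtype=torch.float32)

model = nn.Sequential(nn.Linear(7, 32), nn.ReLU(), nn.Linear(32, 4), nn.Sigmoid())
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    idx = torch.randint(0, 16, (64,))              # random 4-bit messages
    x = codewords[idx] + 0.5 * torch.randn(64, 7)  # AWGN channel
    loss = loss_fn(model(x), labels[idx])          # bitwise cross-entropy
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, hard-thresholding model(x) at 0.5 gives the decoded bits.
```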