2014 8th International Symposium on Turbo Codes and Iterative Information Processing (ISTC)
DOI: 10.1109/istc.2014.6955084

Approximation of activation functions for vector equalization based on recurrent neural networks

Abstract: Activation functions represent an essential element in all neural network structures. They decisively influence the overall behavior of a neural network because of their nonlinear characteristic. Discrete- and continuous-time recurrent neural networks are a special class of neural networks. They have been shown to be able to perform vector equalization without the need for a training phase, because they are Lyapunov stable under specific conditions. The activation function in this case depends on the symbol alphabet…
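
Since the abstract stops short of the algorithmic details, a small sketch may help make the idea concrete. The following is a minimal, hypothetical illustration and not the paper's exact method: a discrete-time Hopfield-type recurrent iteration equalizing a BPSK vector, where the off-diagonal part of the channel correlation matrix is fed back as interference and a tanh activation is matched to the alphabet {−1, +1}. The function name rnn_equalize and the slope parameter beta are invented here; in the literature the slope is typically tied to the noise variance.

```python
import numpy as np

def rnn_equalize(y, R, beta=4.0, iters=50):
    """Illustrative discrete-time RNN vector equalizer for BPSK.

    y    : matched-filter output, y = R @ a + noise
    R    : channel correlation matrix (unit diagonal)
    beta : slope of the tanh activation (hypothetical tuning parameter)
    """
    W = R - np.diag(np.diag(R))   # off-diagonal part: interference terms
    x = np.zeros_like(y)          # neuron outputs (soft symbol estimates)
    for _ in range(iters):
        e = y - W @ x             # cancel the currently estimated interference
        x = np.tanh(beta * e)     # bounded activation matched to {-1, +1}
    return np.sign(x)             # hard BPSK decisions

# Toy example: 4 symbols with mild inter-symbol correlation.
rng = np.random.default_rng(0)
a = rng.choice([-1.0, 1.0], size=4)
R = np.array([[1.0, 0.3, 0.0, 0.0],
              [0.3, 1.0, 0.3, 0.0],
              [0.0, 0.3, 1.0, 0.3],
              [0.0, 0.0, 0.3, 1.0]])
y = R @ a + 0.1 * rng.standard_normal(4)
print(rnn_equalize(y, R), a)
```

Note that no training occurs anywhere in this loop; the weights come directly from the channel, which is the property the abstract attributes to Lyapunov-stable recurrent networks.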

Cited by 4 publications (9 citation statements) | References 8 publications
“…18 Layout of the vector equalizer and pin configuration. 9, 10, 11, 12, 23, 24, 25, 26), six pins for the weights configuration (pads 13, 14, 18, 19, 20, 21), reset (pad 15), voltage supplies (pads 16, 17, 22) and grounds (square pads). The active area is approximately 0.09 mm², with a transistor count CNT = 171 for four neurons.…”
Section: Measurement Results
Mentioning confidence: 99%
“…for l = t_equ. The above stated conditions are valid for BPSK, but can be generalized by combining the results of [10, 18, 19].…”
Section: Equalization Based On Continuous-Time Recurrent Neural Network
Mentioning confidence: 99%
“…The mapping from q_c to x is performed by M. Each symbol ψ represents m bits. A special class of symbol alphabets is the so-called separable symbol alphabet ψ^(s) [19, 20].…”
Section: Block Transmission Model
Mentioning confidence: 99%
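
The excerpt above refers to separable symbol alphabets ψ^(s) without defining them. Assuming the usual meaning, a complex alphabet that splits into independent real and imaginary sub-alphabets (as for square QAM), a short sketch:

```python
import numpy as np

# Illustrative construction (assumption: "separable" meaning the complex
# alphabet is the sum of a real and an imaginary sub-alphabet, as for
# square QAM): 4-QAM built as {-1, +1} + 1j * {-1, +1}.
real_part = np.array([-1.0, 1.0])
imag_part = np.array([-1.0, 1.0])
qam4 = (real_part[:, None] + 1j * imag_part[None, :]).ravel()
print(qam4)  # [-1.-1.j -1.+1.j  1.-1.j  1.+1.j]

# Separability lets a real-valued activation act on each component:
def activation(z, beta=4.0):
    return np.tanh(beta * z.real) + 1j * np.tanh(beta * z.imag)
```

This componentwise structure is what allows the real-valued activation used for BPSK to be reused for complex-valued alphabets, which is the generalization the later excerpts describe.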
“…However, this was limited to the binary phase-shift keying (BPSK) symbol alphabet ψ = {−1, +1}. This has been generalized to complex-valued symbol alphabets in [21] by combining the results of references [20, 22, 32]. Based thereon, it has been proven that the RNN ends in a local minimum of Eq.…”
Section: A Vector Equalization Based On RNN
Mentioning confidence: 99%
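
The equation this excerpt refers to is elided in the snippet, so it is not reproduced here. For orientation only: Hopfield-type recurrent networks decrease a quadratic energy of the general form E(x) = −½ xᵀWx − xᵀy along their trajectory (under symmetry and stability conditions of the kind the excerpts mention), which is why the network settles in a local minimum. A toy check, with all names and values invented for illustration:

```python
import numpy as np

def energy(x, W, y):
    # Hopfield-type quadratic energy; an illustrative stand-in, not the
    # paper's elided cost function.
    return -0.5 * (x @ W @ x) - x @ y

R = np.array([[1.0, 0.3],
              [0.3, 1.0]])           # toy channel correlation matrix
W = R - np.eye(2)                    # symmetric feedback, zero diagonal
a = np.array([1.0, -1.0])            # transmitted BPSK vector
y = R @ a                            # noise-free matched-filter output
x = np.zeros(2)
for k in range(5):
    x = np.tanh(4.0 * (y - W @ x))   # one recurrent update
    print(k, energy(x, W, y))        # decreases toward a local minimum
```

In this toy run the energy is monotonically decreasing; in general, strict decrease is guaranteed for serial or continuous-time updates under the cited stability conditions.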