2015
DOI: 10.1016/j.physleta.2015.05.032

Decoding suprathreshold stochastic resonance with optimal weights

Cited by 13 publications (9 citation statements) · References 45 publications

“…In the case of stationary inputs, the Kalman–LMS recursive algorithm converges, after successive iterations, to the optimum Wiener solution in a statistical sense [24]. Our previous work [17] has shown that, for the case of identical thresholds, the optimal weighted decoding is equivalent to Wiener linear decoding. Therefore, for stationary inputs, the decoding performance of the Kalman–LMS recursive algorithm is consistent with that of optimal weighted decoding in the case of identical thresholds.…”
Section: Conclusion and Discussion
Mentioning confidence: 99%
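
The passage above states that, for a stationary input, the Kalman–LMS recursion converges toward the Wiener solution, which for identical thresholds coincides with the optimal weighted decoding. The following is a minimal sketch of that idea only, not code from the cited papers: the array size, threshold, noise levels and step size are all assumed values, and it simply adapts decoding weights by LMS for an array of identical-threshold devices and compares them with the batch least-squares (Wiener) weights.

```python
import numpy as np

# Sketch (assumed setup): N identical threshold devices each observe the same
# Gaussian input x plus independent Gaussian noise and emit a binary output.
# The input is reconstructed as x_hat = w^T y + b. For a stationary input the
# LMS recursion drives (w, b) toward the Wiener least-squares solution.

rng = np.random.default_rng(0)
N = 15                         # number of threshold devices (assumed)
theta = 0.0                    # identical thresholds (assumed)
sigma_x, sigma_n = 1.0, 0.5    # input and device-noise std (assumed)
mu = 0.01                      # LMS step size (assumed)
T = 50_000                     # number of training samples

w = np.zeros(N)                # adaptive decoding weights
b = 0.0                        # adaptive bias term

for _ in range(T):
    x = rng.normal(0.0, sigma_x)                                   # stationary Gaussian input
    y = (x + rng.normal(0.0, sigma_n, N) > theta).astype(float)    # binary device outputs
    e = x - (w @ y + b)                                            # decoding error
    w += mu * e * y                                                # LMS weight update
    b += mu * e                                                    # LMS bias update

# Batch Wiener (least-squares) weights on fresh data, for comparison.
X = rng.normal(0.0, sigma_x, T)
Y = (X[:, None] + rng.normal(0.0, sigma_n, (T, N)) > theta).astype(float)
A = np.column_stack([Y, np.ones(T)])                               # outputs plus a bias column
w_wiener, *_ = np.linalg.lstsq(A, X, rcond=None)

print("mean LMS weight:", w.mean(), " mean Wiener weight:", w_wiener[:N].mean())
```

With identical thresholds the optimal weights are equal across devices by symmetry, so comparing their means is enough to see the two solutions agree after convergence.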
“…Here, the array sizes are N = 1, 3, 15 and 63 (from top to bottom), and the stationary input signal is Gaussian distributed. The circled red lines correspond to the MSE distortion for the Kalman–LMS recursive algorithm, and the blue stars represent the MSE distortion for the optimal weighted decoding presented in [17].…”
Section: Conclusion and Discussion
Mentioning confidence: 99%
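
The caption above compares the MSE distortion of the two decoders across array sizes. As a rough complement, here is a hedged sketch (all parameters are assumptions, not values from the cited figure) that estimates the MSE of Wiener-weighted decoding for the same array sizes N = 1, 3, 15 and 63; it only illustrates the trend that distortion shrinks as the array grows.

```python
import numpy as np

# Sketch (assumed noise level, threshold and input scale): MSE distortion of
# least-squares (Wiener) weighted decoding of an N-device threshold array with
# a stationary Gaussian input, for N = 1, 3, 15 and 63.

rng = np.random.default_rng(1)
sigma_x, sigma_n, theta, T = 1.0, 0.5, 0.0, 100_000

x = rng.normal(0.0, sigma_x, T)                            # stationary Gaussian input
for N in (1, 3, 15, 63):
    # Binary outputs of N identical-threshold devices with independent noise.
    y = (x[:, None] + rng.normal(0.0, sigma_n, (T, N)) > theta).astype(float)
    A = np.column_stack([y, np.ones(T)])                   # outputs plus a bias column
    w, *_ = np.linalg.lstsq(A, x, rcond=None)              # Wiener (least-squares) weights
    mse = np.mean((x - A @ w) ** 2)                        # MSE distortion of the decoding
    print(f"N = {N:2d}: MSE ≈ {mse:.4f}")
```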