Continuous Recurrent Neural Networks Based on Function Satlins (2022)
DOI: 10.1007/s11063-021-10682-9

Cited by 3 publications (2 citation statements). References 31 publications.
“…Recent work in computer vision takes inspiration from PC theory to build models for accurate (Han et al., 2018) and robust (Huang et al., 2020) image classification. PredNet (Lotter et al., 2017) proposes a network capable of predicting future frames in a video sequence by making local predictions at each level using top-down connections.…”
Section: Predictive Coding and Deep Learning
Citation type: mentioning (confidence: 99%)
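The local, per-level prediction error described in this excerpt can be illustrated with a minimal sketch. The function name, tensor shapes, and the PredNet-style convention of concatenating rectified positive and negative errors are illustrative assumptions, not code from the cited papers.

```python
import numpy as np

def prediction_error(actual, top_down_prediction):
    """Local prediction error at one level of a hierarchical
    predictive-coding model: the positive and negative parts of the
    difference between the level's activation and the top-down
    prediction, rectified and concatenated along the feature axis
    (a PredNet-style convention; assumed here for illustration)."""
    diff = actual - top_down_prediction
    return np.concatenate([np.maximum(diff, 0.0),
                           np.maximum(-diff, 0.0)], axis=-1)

# Toy usage: one level's activation and the prediction sent down from above.
x = np.random.rand(4, 4, 8)        # e.g. height x width x channels
x_hat = np.random.rand(4, 4, 8)    # top-down prediction of x
err = prediction_error(x, x_hat)   # shape (4, 4, 16), passed up to the next level
```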
“…Given a speech dataset of clean-noisy pairs, neural networks can learn to transform the noisy magnitude spectra to their clean counterparts (mapping-based) [12–14] or to estimate time-frequency masks (masking-based) such as the ideal binary mask (IBM) [15, 16], the ideal ratio mask (IRM) [17, 18], and the spectral magnitude mask (SMM) [19]. Fully connected networks (FCN) [19], feedforward neural networks (FDNN) with Kalman filtering [20], recurrent neural networks (RNN) [21–23], and convolutional neural networks (CNN) [24, 25] are important deep learning approaches in SE.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
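The masking-based targets mentioned in this excerpt can be sketched from their common definitions. The functions below assume clean-speech and noise magnitude spectrograms are available as NumPy arrays; the names, the exponent beta = 0.5 for the IRM, and the 0 dB IBM threshold are illustrative choices, not values taken from the cited works.

```python
import numpy as np

def ideal_ratio_mask(clean_mag, noise_mag, beta=0.5):
    """Ideal ratio mask (IRM): a common definition is
    (S^2 / (S^2 + N^2)) ** beta, giving a soft mask in [0, 1]."""
    s2 = clean_mag ** 2
    n2 = noise_mag ** 2
    return (s2 / (s2 + n2 + 1e-8)) ** beta

def ideal_binary_mask(clean_mag, noise_mag, threshold_db=0.0):
    """Ideal binary mask (IBM): 1 in time-frequency bins where the
    local SNR exceeds a threshold, 0 elsewhere."""
    snr_db = 20.0 * np.log10((clean_mag + 1e-8) / (noise_mag + 1e-8))
    return (snr_db > threshold_db).astype(np.float32)

# Toy usage on random magnitude spectra (frequency bins x frames).
S = np.abs(np.random.randn(257, 100))   # clean-speech magnitude
N = np.abs(np.random.randn(257, 100))   # noise magnitude
irm = ideal_ratio_mask(S, N)            # soft mask applied to the noisy spectrum
ibm = ideal_binary_mask(S, N)           # hard 0/1 mask
```

In a masking-based enhancement setup, a network would be trained to predict such a mask from the noisy spectrum, and the enhanced magnitude is obtained by multiplying the noisy magnitude by the predicted mask.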