2021
DOI: 10.1109/tnnls.2020.2979688

Convergence Analysis of Adaptive Exponential Functional Link Network

Cited by 18 publications (8 citation statements)
References 37 publications
“…This discriminative feature extraction is performed by the signal processing model shown in Fig. 3 [14,15]. AEFLN is deployed for various non-linear applications such as system identification, echo cancellation, and noise control.…”
Section: Feature Extraction Using PSD Technique
confidence: 99%
“…The AFLN model is a modified version of the Adaptive Exponential Functional Link Network (AEFLN) proposed in [14,15]. AEFLN is deployed for various non-linear applications such as system identification, echo cancellation, and noise control.…”
Section: Feature Extraction Using AFLN
confidence: 99%
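The AEFLN referenced in these statements expands each input sample with trigonometric basis functions scaled by an adaptive exponential factor, and adapts both the output weights and the exponent by gradient descent. Below is a minimal Python sketch of such a filter for system identification, assuming the commonly cited expansion {x, e^(-a|x|)sin(p·pi·x), e^(-a|x|)cos(p·pi·x)}; the function names, expansion order P, and step sizes mu_w, mu_a are illustrative assumptions, not values from the paper.

import numpy as np

def aefln_expand(x, a, P):
    """Functional expansion of a scalar input x with exponential factor a."""
    env = np.exp(-a * abs(x))
    feats = [x]
    for p in range(1, P + 1):
        feats.append(env * np.sin(p * np.pi * x))
        feats.append(env * np.cos(p * np.pi * x))
    return np.array(feats)

def aefln_identify(x, d, P=2, mu_w=0.01, mu_a=0.001):
    """LMS-type identification of a nonlinear system from input x, desired d."""
    w = np.zeros(2 * P + 1)
    a = 0.5                                   # initial exponential factor
    err = np.zeros(len(x))
    for n in range(len(x)):
        phi = aefln_expand(x[n], a, P)
        e = d[n] - w @ phi                    # a-priori error
        # d(phi)/da = -|x| * (exponentially scaled trig part of phi)
        dphi_da = -abs(x[n]) * np.concatenate(([0.0], phi[1:]))
        w += mu_w * e * phi                   # weight update
        a += mu_a * e * (w @ dphi_da)         # exponent update
        err[n] = e
    return w, a, err

A convergence analysis of this family must handle the coupling between the weight and exponent recursions, which is why both step sizes are kept small in the sketch.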
“…Assumption 3: The error sequence e(n) is asymptotically uncorrelated with g^2(n) and |h^T w(n)|^2. This has been commonly used for analyzing EFLN-based algorithms at steady state [42,46].…”
Section: Steady-State Performance
confidence: 99%
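Since "asymptotically uncorrelated" means the steady-state expectations factor, Assumption 3 is typically applied as the separation step sketched below (LaTeX notation, with E denoting expectation; this is the generic use of such an assumption, not a line quoted from the paper):

\lim_{n\to\infty} E\!\left[e^{2}(n)\,g^{2}(n)\right]
  \;=\; \lim_{n\to\infty} E\!\left[e^{2}(n)\right] E\!\left[g^{2}(n)\right],

and likewise for E[e^{2}(n)\,|h^{T}w(n)|^{2}], which lets the steady-state error power be pulled out of the weight-error energy recursion.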
“…By IPLNAs in this paper, we consider a wide family of shallow neural networks including the extreme learning machine (ELM) or random vector functional link (RVFL) networks [1]-[3], other functional-link and kernel neural architectures and filters, e.g., [4]-[6], and polynomial neural networks [7], [8], including basic standalone structures called polynomial neural units in [9], [10], and references therein. This letter shows that the input-to-state stability (ISS) concept [11] and BIBS stability [12] generally apply to the gradient learning algorithms and their many modifications for IPLNAs.…”
Section: Introduction
confidence: 99%
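The letter's claim concerns gradient learning of functional-link/RVFL-style shallow networks. Below is a hedged Python sketch of such learning with a normalized step size, the standard way the weight update stays bounded for bounded inputs (the BIBS-flavoured property at issue); the feature map and the parameters n_hidden, mu, eps are illustrative assumptions, not the letter's construction.

import numpy as np

rng = np.random.default_rng(0)

def rvfl_features(x, W_in, b):
    """Fixed random expansion: direct input plus tanh hidden units."""
    return np.concatenate([x, np.tanh(W_in @ x + b)])

def train_nlms(X, d, n_hidden=20, mu=0.5, eps=1e-6):
    """Normalized gradient (NLMS-type) learning of the output weights only."""
    W_in = rng.standard_normal((n_hidden, X.shape[1]))
    b = rng.standard_normal(n_hidden)
    w = np.zeros(X.shape[1] + n_hidden)
    for x, target in zip(X, d):
        phi = rvfl_features(x, W_in, b)
        e = target - w @ phi
        # normalization makes each increment scale-invariant in phi
        # (classical NLMS step, stable for 0 < mu < 2 in the
        # linear-in-parameters setting), so bounded inputs and bounded
        # errors give bounded weight updates
        w += mu * e * phi / (eps + phi @ phi)
    return w, W_in, b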