2021
DOI: 10.1016/j.eswa.2021.115662

A self-organizing incremental neural network for continual supervised learning

Cited by 19 publications (10 citation statements)
References 22 publications
“…In [33], the authors have introduced a new memory population approach (CBRS) for continual online learning that deals with imbalanced and temporally correlated data. Other pertinent methods for enhancing Online Continual Learning were suggested in [34] [35] [36]. For data deduplication, even if the deduplication model is trained with high-quality pairs, features defining duplications may change over time, especially when data is human input.…”
Section: E. Continual Learning (Model Retraining)
confidence: 99%
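The CBRS idea mentioned above, a replay memory kept class-balanced under imbalanced and temporally correlated streams, can be sketched as a class-balancing reservoir sampler. This is a minimal illustration of the general technique, not the reference implementation; the class and attribute names are the author's own:

```python
import random
from collections import defaultdict

class ClassBalancedMemory:
    """Minimal sketch of a class-balanced replay memory (CBRS-style)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = defaultdict(list)   # class label -> stored samples
        self.seen = defaultdict(int)     # class label -> count seen in stream

    def __len__(self):
        return sum(len(v) for v in self.slots.values())

    def largest_classes(self):
        m = max(len(v) for v in self.slots.values())
        return [c for c, v in self.slots.items() if len(v) == m]

    def add(self, x, y):
        self.seen[y] += 1
        if len(self) < self.capacity:
            self.slots[y].append(x)      # memory not yet full: always store
        elif y not in self.largest_classes():
            # incoming class is under-represented: evict from a largest class
            victim = random.choice(self.largest_classes())
            self.slots[victim].pop(random.randrange(len(self.slots[victim])))
            self.slots[y].append(x)
        else:
            # incoming class already dominant: within-class reservoir sampling
            if random.random() < len(self.slots[y]) / self.seen[y]:
                self.slots[y][random.randrange(len(self.slots[y]))] = x
```

Feeding 100 samples of one class followed by 5 of another leaves the memory at capacity with all 5 minority samples retained, which is the behavior that plain reservoir sampling cannot guarantee on an imbalanced, temporally correlated stream.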
“…Typical classifiers of this type are Episodic-GWR [38] and ASOINN Classifier (ASC) [8], which utilize GWR and ASOINN, respectively. One state-of-the-art algorithm is SOINN+ with ghost nodes (GSOINN+) [10]. GSOINN+ has successfully improved the classification performance by generating some ghost nodes near a decision boundary of each class.…”
Section: B. Classification Algorithms Capable of Continual Learning
confidence: 99%
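The ghost-node idea credited to GSOINN+ above, extra prototypes generated near the decision boundary of each class, can be illustrated with a simple stand-in rule: for each prototype, find its nearest node of a different class and, if they are close enough, insert a ghost at their midpoint carrying the original node's label. This is an assumption-laden simplification of the published generation rule, meant only to show the shape of the technique:

```python
import numpy as np

def add_ghost_nodes(nodes, labels, tau=1.0):
    """Illustrative boundary 'ghost' generation (GSOINN+-inspired sketch).

    For each prototype, locate its nearest node of a different class; if the
    pair is closer than tau, place a ghost at the midpoint labeled with the
    prototype's own class, sharpening the 1-NN boundary between the classes.
    """
    nodes = np.asarray(nodes, dtype=float)
    labels = np.asarray(labels)
    ghosts, ghost_labels = [], []
    for x, y in zip(nodes, labels):
        enemy_mask = labels != y
        if not enemy_mask.any():
            continue  # no other class present: no boundary to refine
        d = np.linalg.norm(nodes[enemy_mask] - x, axis=1)
        j = int(np.argmin(d))
        if d[j] < tau:
            ghosts.append((x + nodes[enemy_mask][j]) / 2.0)
            ghost_labels.append(y)
    return np.array(ghosts), np.array(ghost_labels)
```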
“…This section presents quantitative comparisons for classification performance of ASC [8], FTCAC, SOINN+C, GSOINN+ [10], CAEAC, CAEAC-I, and CAEAC-C. The source code of ASC, FTCA, SOINN+, and GSOINN+ is provided by the authors of the related papers.…”
Section: Simulation Experiments
confidence: 99%
“…As a GNG-based algorithm, Grow When Required (GWR) [18] and gamma-GWR [19] are successful algorithms which appropriately calculate a similarity threshold to prevent an excessive node creation. As a SOINN-based algorithm, SOINN+ [20] and SOINN+ with ghost nodes (GSOINN+) [21] can detect clusters of arbitrary shapes in noisy data streams while avoiding catastrophic forgetting. Another successful approach is ART-based clustering algorithms such as Fuzzy ART [7], Bayes ART [8], and their variants [9], [10].…”
Section: Literature Review, A. Clustering Algorithms Capable of Continual Learning
confidence: 99%
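The GWR mechanism cited above, creating a node only when the best-matching node's activity for the current input falls below a threshold, can be sketched as follows. The threshold value and the midpoint placement of the new node are illustrative; full GWR additionally uses habituation counters and an edge to the second-best winner:

```python
import numpy as np

def gwr_step(weights, x, activity_threshold=0.8, eps=0.1):
    """One simplified Grow When Required update (parameters illustrative).

    Activity a = exp(-||x - w_b||) measures how well the best-matching node
    w_b represents the input; a node is created only when a falls below the
    threshold, which is what prevents excessive node creation.
    """
    x = np.asarray(x, dtype=float)
    dists = np.linalg.norm(weights - x, axis=1)
    b = int(np.argmin(dists))
    activity = np.exp(-dists[b])
    if activity < activity_threshold:
        # input poorly matched: grow a node between the winner and the input
        new_node = (weights[b] + x) / 2.0
        return np.vstack([weights, new_node]), True
    weights = weights.copy()
    weights[b] += eps * (x - weights[b])  # well matched: adapt winner instead
    return weights, False
```

An input near an existing node is absorbed by adaptation, while a distant input triggers growth, so the network size tracks the complexity of the data rather than the number of samples seen.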