2023
DOI: 10.20965/ijat.2023.p0206
Multi-Scale Batch-Learning Growing Neural Gas Efficiently for Dynamic Data Distributions

Abstract: Growing neural gas (GNG) has many applications, including topology preservation, feature extraction, dynamic adaptation, clustering, and dimensionality reduction. These methods have broad applicability in extracting the topological structure of 3D point clouds, enabling unsupervised motion estimation, and depicting objects within a scene. Furthermore, multi-scale batch-learning GNG (MS-BL-GNG) has improved learning convergence. However, it is only implemented on static or stationary datasets, and adapting to d…
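The abstract's applications all rest on GNG's core competitive update: for each input, move the winning node (and its graph neighbours) toward the sample while aging and pruning edges. The sketch below is a minimal, hedged illustration of that single online step, not the paper's MS-BL-GNG method; parameter names (`eps_w`, `eps_n`, `max_age`) are illustrative defaults.

```python
def dist2(a, b):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def gng_step(nodes, edges, x, eps_w=0.05, eps_n=0.006, max_age=50):
    """One online GNG update for input sample x (illustrative sketch).

    nodes: list of weight vectors (lists of floats)
    edges: dict mapping frozenset({i, j}) -> edge age
    """
    # 1. Find the two nearest nodes: winner s1 and runner-up s2.
    order = sorted(range(len(nodes)), key=lambda i: dist2(nodes[i], x))
    s1, s2 = order[0], order[1]

    # 2. Age every edge incident to the winner.
    for e in list(edges):
        if s1 in e:
            edges[e] += 1

    # 3. Move the winner and its topological neighbours toward x.
    nodes[s1] = [w + eps_w * (xi - w) for w, xi in zip(nodes[s1], x)]
    for e in edges:
        if s1 in e:
            j = next(iter(e - {s1}))
            nodes[j] = [w + eps_n * (xi - w) for w, xi in zip(nodes[j], x)]

    # 4. Connect winner and runner-up with a fresh (age-0) edge.
    edges[frozenset((s1, s2))] = 0

    # 5. Prune edges that exceeded the maximum age.
    for e in [e for e in edges if edges[e] > max_age]:
        del edges[e]
    return s1, s2
```

Classic GNG additionally inserts a new node only every fixed number of steps, which is exactly the slow-growth behaviour the batch and add-if-silent variants discussed below aim to overcome.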

Cited by 3 publications (6 citation statements)
References 39 publications
“…Moreover, both methods add a node only once per epoch or batch, which slows learning. To accelerate network growth, Fernando et al. [3] propose applying an "add-if-silent" strategy on top of the MS-BL-GNG method so that nodes can be inserted into the network at every iteration (FastMS-BL-GNG). With this implementation, FastMS-BL-GNG is able to operate on dynamic or non-stationary data distributions.…”
Section: Related Work
confidence: 99%
“…Therefore, we introduce batch learning with matrix computation to accelerate learning. There are various GNG batch-learning methods [3,12,13]. One approach uses fuzzy c-means (FCM) to calculate the membership of each network node with respect to each sample in the batch [12].…”
Section: Batch Learning Growing Neural Gas
confidence: 99%
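The membership computation mentioned in this excerpt can be illustrated with the standard FCM membership formula, u[i][j] = 1 / Σ_k (d(node_i, x_j) / d(node_k, x_j))^(2/(m-1)), which makes every batch sample contribute a soft weight to every node. This is a sketch of that standard formula, assumed here as the concrete form of the computation in [12], not code from that paper.

```python
import math

def fcm_memberships(nodes, batch, m=2.0, eps=1e-12):
    """Fuzzy c-means membership of each node for each batch sample.

    Returns u with u[i][j] = membership of node i for sample j;
    memberships over nodes sum to 1 for every sample. (Standard FCM
    formula with fuzzifier m; eps guards against division by zero
    when a sample coincides with a node.)
    """
    u = [[0.0] * len(batch) for _ in nodes]
    for j, x in enumerate(batch):
        d = [max(math.dist(n, x), eps) for n in nodes]
        for i in range(len(nodes)):
            u[i][j] = 1.0 / sum((d[i] / dk) ** (2.0 / (m - 1.0)) for dk in d)
    return u
```

In a batch update, each node would then be moved to the membership-weighted mean of the batch (weights u[i][j]**m), which is what makes the whole step expressible as a single matrix computation.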