2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489776
Online Semi-supervised Growing Neural Gas for Multi-label Data Classification

Cited by 10 publications (5 citation statements)
References 24 publications
“…[14,15] presented online multi-label algorithms in which the dataset is divided into several single-label datasets in order to solve the multi-label classification problem. The online multi-label semi-supervised method (OMLSS) [16] introduced non-local labeling functions that take the network topology into account when predicting a label, and strengthened the influence of the labeling strategy on the network topology in the multi-label case by using the labels to improve the synaptic links.…”
Section: Related Work (1. Online Learning); citation type: mentioning
Confidence: 99%
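The decomposition described in [14,15] is, in essence, binary relevance: each label gets its own single-label learner. Below is a minimal sketch of that idea, not the cited authors' code; the SGDClassifier base learner and all names are illustrative assumptions.

```python
# Hedged sketch: binary-relevance decomposition of a multi-label stream.
# One independent online binary classifier per label (illustrative choice).
import numpy as np
from sklearn.linear_model import SGDClassifier

class OnlineBinaryRelevance:
    def __init__(self, n_labels):
        # one online learner per label; SGDClassifier is an assumption here
        self.learners = [SGDClassifier(loss="log_loss") for _ in range(n_labels)]

    def partial_fit(self, x, y):
        # x: (n_features,) sample; y: (n_labels,) binary indicator vector
        for learner, y_l in zip(self.learners, y):
            learner.partial_fit(x.reshape(1, -1), [y_l], classes=[0, 1])

    def predict(self, x):
        return np.array([l.predict(x.reshape(1, -1))[0] for l in self.learners])

# Usage: feed one sample at a time, as in an online setting.
model = OnlineBinaryRelevance(n_labels=4)
model.partial_fit(np.random.rand(10), np.array([1, 0, 0, 1]))
```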
“…Online learning and class-incremental learning: online learning is defined as follows. The training data arrive in a sequence, and the model parameters are updated online so that, after each iteration, the model gives better results on new test samples [21][22][23][24][25]. Class-incremental learning segments the training data into sequences according to tasks, processes the tasks in order, incrementally learns a classification model, and can then effectively classify any new labels in the task sequence [26][27][28][29].…”
Section: Related Work; citation type: mentioning
Confidence: 99%
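A minimal sketch of the online-learning protocol defined in this quote, assuming a linear model with squared loss (both illustrative choices, not from the cited works): samples arrive one at a time and the parameters are updated immediately after each prediction.

```python
# Hedged sketch: predict-then-update (prequential) online learning loop.
import numpy as np

def online_sgd(stream, n_features, lr=0.01):
    """Consume (x, y) pairs one at a time, updating weights after each."""
    w = np.zeros(n_features)
    for x, y in stream:
        y_hat = w @ x                 # predict with the current parameters
        w -= lr * (y_hat - y) * x     # update immediately on the new sample
        yield y_hat                   # prediction was made before the update
```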
“…Many lines of research have been proposed over the past few decades to address the problem of new labels, such as online learning [21][22][23][24][25], class-incremental learning [26][27][28][29], zero-shot learning [30][31][32][33], generalized out-of-distribution detection [34][35][36][37][38][39][40][41][42][43][44][45], etc. In the zero-shot learning problem, the information about new labels is usually known during the training stage, including their semantic information and their total number.…”
Section: Introduction; citation type: mentioning
Confidence: 99%
“…For larger data sets, the learning process is more agile than other commonly used clustering techniques, and it does not require significant RAM resources (Migdał-Najman and Najman 2013; Boulbazine et al. 2018).…”
Section: Artificial Neural Network Type GNG; citation type: mentioning
Confidence: 99%
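A minimal sketch of why a Growing Neural Gas keeps its RAM footprint small, under the assumption (not stated in the cited sources) that the model is stored as prototype nodes plus an aging edge list: memory scales with the number of nodes, not with the size of the data set.

```python
# Hedged sketch: the entire GNG model is a small graph of prototypes.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class GNGNode:
    weight: np.ndarray   # prototype vector living in the input space
    error: float = 0.0   # accumulated quantization error for this node

@dataclass
class GNGGraph:
    nodes: list = field(default_factory=list)   # list[GNGNode]
    edges: dict = field(default_factory=dict)   # (i, j) -> edge age

# Usage: only the prototypes and edges are kept, never the raw stream.
g = GNGGraph(nodes=[GNGNode(np.zeros(2)), GNGNode(np.ones(2))])
g.edges[(0, 1)] = 0   # fresh edge between nodes 0 and 1, age 0
```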