2020
DOI: 10.1016/j.neucom.2019.09.039
Extreme semi-supervised learning for multiclass classification

Cited by 10 publications (10 citation statements)
References 19 publications
“…Hence, support is required to get the best-generated samples for the discriminator. So, the authors are motivated by some techniques 7,8,[10][11][12] in generating realistic data with multi-class attributes. These techniques 7,8,[10][11][12] can generate real data by considering only a particular set of features and hence fail to result in the equal distribution of the given dataset.…”
Section: Motivation
confidence: 99%
“…So, the authors are motivated by some techniques 7,8,[10][11][12] in generating realistic data with multi-class attributes. These techniques 7,8,[10][11][12] can generate real data by considering only a particular set of features and hence fail to result in the equal distribution of the given dataset. Thus, the research gaps associated with existing GAN models are as follows:…”
Section: Motivation
confidence: 99%
“…Examples include ELMs for online sequential learning [3], ELMs for imbalanced problems [4], ELMs for semi-supervised learning [5], ELMs for unsupervised learning [6], ELMs for compressive learning [8], etc. However, as analyzed in [9] [10], most of them still suffer from heavy overfitting and large model sizes for benchmark applications. This would restrict the broader applicability of ELMs, especially to computationally constrained devices such as mobile devices and wearables.…”
Section: Introduction
confidence: 99%
“…Thus, ELM is much faster and easier to implement than most state-of-the-art machine learning approaches. In the past decade, ELM theory and applications have attracted numerous attention, its variants and extensions have been developed for specific problems, such as online sequential learning [9], imbalance learning [10], multilabel learning [11], compressive learning [12], and compact modeling [13], etc. ELM was recently extended to hierarchical structure for dealing with complex tasks, i.e., multilayer ELM (ML-ELM).…”
Section: Introduction
confidence: 99%