2020
DOI: 10.1109/access.2020.2998428
A Novel Adaptive Multi-View Non-Negative Graph Semi-Supervised ELM

Abstract: This paper presents a semi-supervised learning framework that integrates multi-view learning, the extreme learning machine (ELM), and graph-based semi-supervised learning. The aim is to extend the non-negative sparse graph (NNSG) framework to multi-view data and non-linear relationships. The proposed multi-view learning method is adaptive: when the data are single-view, the framework degenerates into an embedded framework for NNSG. The proposed ELM method als…
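The framework builds on the extreme learning machine, whose defining idea is that the hidden layer is random and fixed while only the output weights are solved in closed form. A minimal sketch in plain NumPy (hypothetical function names; not the paper's implementation, which adds the multi-view and graph terms):

```python
import numpy as np

def elm_train(X, Y, n_hidden=32, reg=1e-3, seed=0):
    """Basic ELM: random, fixed input weights; ridge-regression output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    # Closed-form output weights: beta = (H^T H + reg*I)^{-1} H^T Y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit one-hot XOR labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
W, b, beta = elm_train(X, Y)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

Because the hidden layer is never trained, fitting reduces to a single linear solve, which is what gives ELM its speed advantage over backpropagation-trained networks.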

Cited by 8 publications (3 citation statements)
References 28 publications
“…2020; Zheng et al. 2020). By using the regularized correntropy criterion and half-quadratic optimization technique, both convergence speed and performance show superiority over the original (Yang et al.…”
Section: Introduction
confidence: 99%
“…SSL is a learning strategy that improves the performance of deep neural networks by utilizing unlabeled samples when labeled data are scarce [13]. Current SSL methods can be categorized into four groups: generative models [14], consistency regularization [15], graph neural network models (GNNs) [16], and pseudo-labeling [17].…”
Section: Semi-supervised Learning
confidence: 99%
“…Generalization performance is the main concern for learning algorithms; the balance between computational complexity and generalization ability has been extended via ELM (Ragusa et al, 2020). With designed data- and model-parallel ELMs, large-scale learning tasks can be tackled (Ming et al, 2018); moreover, a tradeoff should be made between efficiency and scalability, so that the algorithm has complementary advantages. With the aid of graph learning and adaptive unsupervised/semi-supervised clustering methods, flexible and discriminative data embedding can be achieved (Zeng et al, 2020b; Zheng et al, 2020). By using the regularized correntropy criterion and half-quadratic optimization technique, both convergence speed and performance show superiority over the original (Yang et al, 2020a), and a robust variant of the algorithm has been studied in (Yang et al, 2020b).…”
Section: Introduction
confidence: 99%
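The correntropy criterion with half-quadratic optimization named in the excerpt reduces, in its simplest form, to iteratively reweighted least squares: each half-quadratic step assigns every sample a Gaussian weight based on its residual, so gross outliers are effectively down-weighted. A hedged sketch with hypothetical names, not the cited authors' exact formulation:

```python
import numpy as np

def correntropy_ridge(H, Y, sigma=0.5, reg=1e-3, iters=20):
    """Robust output weights via half-quadratic iteration:
    alternate Gaussian reweighting and a weighted ridge solve."""
    n, h = H.shape
    beta = np.linalg.solve(H.T @ H + reg * np.eye(h), H.T @ Y)  # plain LS init
    for _ in range(iters):
        r = np.linalg.norm(Y - H @ beta, axis=1)   # per-sample residual norms
        w = np.exp(-r**2 / (2 * sigma**2))         # HQ auxiliary weights in (0, 1]
        Hw = H * w[:, None]                        # diag(w) @ H
        beta = np.linalg.solve(H.T @ Hw + reg * np.eye(h), Hw.T @ Y)
    return beta

# Toy usage: recover the line y = 2x despite one gross outlier
x = np.linspace(0, 1, 20)
H = np.c_[x, np.ones_like(x)]          # [slope, intercept] design
y = (2 * x).reshape(-1, 1).copy()
y[10, 0] += 10.0                        # inject a single large outlier
beta = correntropy_ridge(H, y)
```

The outlier's residual stays large, so its weight decays toward zero and the fit converges to the clean-data solution; ordinary least squares, by contrast, would be pulled noticeably off the true slope.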