2019
DOI: 10.1109/access.2019.2910322

Multi-View Classification via a Fast and Effective Multi-View Nearest-Subspace Classifier

Abstract: Multi-view data, represented across multiple views, contains more complementary information than any single view, and multi-view learning aims to explore and exploit this information. In general, most existing multi-view learning methods consider the correlation between views. However, the relationship between classes and views, which is also important in multi-view learning, has not been addressed in existing works. In this paper, we propose a fast and effective multi-view nearest-subspace classifier (MV…

Cited by 19 publications (2 citation statements)
References 35 publications

“…Several approaches have been proposed over the years for the analysis of multi-view data. They include techniques on clustering [Ye et al, 2018, Ou et al, 2018], classification [Shu et al, 2019], regression [Li et al, 2019], dimensionality reduction [Sun, 2013] and more [Xu et al, 2013, Zhao et al, 2017]. However, the literature mostly lacks efficient algorithms that allow the construction of a single visualisation, through the simultaneous analysis of multiple data-views.…”
Section: Introduction (mentioning, confidence: 99%)
“…However, compatible and complementary information across all views is typically underutilized. Recently, numerous multi-view clustering approaches [24,25,26,27,28,29] have become available. Bickel and Scheffer [30] extended the classic K-means and expectation-maximization clustering methods to the multi-view case to deal with text data with two views.…”
Section: Introduction (mentioning, confidence: 99%)
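
The second excerpt above mentions Bickel and Scheffer's multi-view extension of K-means. As a rough illustration only (not the MV-NSC method of the cited paper, and not their exact algorithm), the sketch below shows a minimal two-view K-means in which the partition found in one view drives the centroid update in the other view; the function name `two_view_kmeans` and all parameters are hypothetical.

```python
import numpy as np

def two_view_kmeans(X1, X2, k, iters=20, seed=0):
    """Toy two-view K-means sketch (co-training style):
    the assignments produced while clustering one view are used
    to recompute centroids in the other view on the next pass."""
    rng = np.random.default_rng(seed)
    n = X1.shape[0]
    labels = rng.integers(0, k, size=n)          # random initial partition
    views = [np.asarray(X1, dtype=float), np.asarray(X2, dtype=float)]
    for t in range(iters):
        X = views[t % 2]                         # alternate between the two views
        # "M-step": centroids from the partition carried over from the other view
        centroids = np.stack([
            X[labels == c].mean(axis=0) if np.any(labels == c)
            else X[rng.integers(0, n)]           # re-seed an empty cluster
            for c in range(k)
        ])
        # "E-step": re-assign every point within the current view
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
    return labels

if __name__ == "__main__":
    # Example: two random views of the same 100 samples
    rng = np.random.default_rng(1)
    view1 = rng.normal(size=(100, 5))
    view2 = rng.normal(size=(100, 8))
    print(two_view_kmeans(view1, view2, k=3))
```

The alternation is what makes the procedure multi-view: information discovered in one representation constrains the clustering of the other, rather than clustering each view independently.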