Robust and Scalable Graph-Based Semisupervised Learning
Wei Liu, Jun Wang, Member IEEE, and Shih-Fu Chang, Fellow IEEE
2012
DOI: 10.1109/jproc.2012.2197809

Abstract: Graph-based semisupervised learning methods, along with new techniques for handling contaminated noisy labels and gigantic data sizes in web applications, are reviewed in this paper. Graph-based semisupervised learning (GSSL) provides a promising paradigm for modeling the manifold structures that may exist in massive data sources in high-dimensional spaces. It has been shown effective in propagating a limited amount of initial labels to a lar…

Cited by 177 publications (88 citation statements)
References 39 publications
Citation statements (ordered by relevance):
“…(3) Extended MNIST: This dataset is widely used in many large-scale graph-based works [14], [19], [24]. …”
Section: Methods
Citation type: mentioning
confidence: 99%
“However, since it requires a dense weight matrix to build the relationships between each datapoint and all anchors, its storage requirement becomes impractical for large-scale datasets. Liu et al. [23], [24] then presented anchor graph models by constructing the inter-layer edges between datapoints and their nearby anchors. In addition, they introduced a geometric reconstruction method for weight estimation to improve its effectiveness.…”
Section: Related Work
Citation type: mentioning
confidence: 99%
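The geometric reconstruction idea described in the excerpt above can be sketched as follows: each datapoint is expressed as a convex combination of its few nearest anchors, and those combination coefficients become the data-to-anchor edge weights. The snippet below is a minimal illustration of that idea, not the solver from [23], [24]; the projected-gradient routine, step size, and parameter names (s nearest anchors, n_iter) are assumptions made for the sketch.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex (w >= 0, sum w = 1)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1.0) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def anchor_weights(X, anchors, s=3, n_iter=50):
    """Data-to-anchor weights Z (n x m): each row has at most s nonzeros,
    obtained by reconstructing x_i from its s nearest anchors under a
    simplex constraint (nonnegative weights summing to one)."""
    n, m = X.shape[0], anchors.shape[0]
    Z = np.zeros((n, m))
    for i, x in enumerate(X):
        idx = np.argsort(np.linalg.norm(anchors - x, axis=1))[:s]  # s nearest anchors
        U = anchors[idx]                       # (s, d) local anchor subset
        z = np.full(s, 1.0 / s)                # start from uniform weights
        step = 1.0 / (np.linalg.norm(U, 2) ** 2 + 1e-12)  # safe gradient step size
        for _ in range(n_iter):                # projected gradient descent
            grad = U @ (U.T @ z - x)           # gradient of 0.5 * ||U^T z - x||^2
            z = project_simplex(z - step * grad)
        Z[i, idx] = z
    return Z
```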
“In this regard, recent experimental results with anchor graphs suggest a way to proceed. In [8], [7], [2], the predictive power of non-parametric regression rooted in the anchors/landmarks ensures a way of constructing very informative weighted kNN graphs from a reduced set of representatives (anchors). Since anchor graphs are bipartite (only data-to-anchor edges exist), this representation bridges the sparsity of the pattern space, because a random walk traveling from node u to node v must reach one or more anchors in advance.…”
Section: Motivation
Citation type: mentioning
confidence: 99%
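Because only data-to-anchor edges exist, the (never materialized) data-to-data affinity is implicitly the low-rank product Z Λ^{-1} Z^T, with Λ holding the anchor degrees; this is the construction commonly used in the anchor-graph literature, though the exact normalization here is an assumption rather than a quotation from the excerpt above. The toy sketch below only illustrates that every data-to-data transition factors through an anchor, using random numbers in place of a real Z.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 3                                   # toy sizes standing in for n >> m
Z = rng.random((n, m))
Z /= Z.sum(axis=1, keepdims=True)             # row-stochastic data-to-anchor weights

Lam = Z.sum(axis=0)                           # anchor "degrees" (column sums of Z)
A_hat = Z @ np.diag(1.0 / Lam) @ Z.T          # implicit (n x n) affinity, rank <= m

# A walk from data node u to data node v must pass through an anchor:
# composing the u -> anchor and anchor -> v transitions recovers A_hat exactly.
to_anchor = Z                                 # u -> anchor probabilities
to_data = (Z / Lam).T                         # anchor -> v probabilities (column-normalized)
print(np.allclose(to_anchor @ to_data, A_hat))  # True
```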
“In [6] an approximate kNN graph is obtained in O(dn^t) with t ∈ (1, 2) by recursively dividing and gluing the samples. More recently, anchor graphs [15], [13] provide data-to-anchor kNN graphs, where the m ≪ n representatives (anchors) are typically obtained through K-means clustering, in O(dmnT + dmn), where O(dmnT) is due to the T iterations of the K-means process. These graphs tend to make out-of-sample predictions compatible with those of Nyström approximations, and in turn their approximated adjacency/affinity matrices are guaranteed to be positive semidefinite.…”
Section: Motivation
Citation type: mentioning
confidence: 99%
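A compact sketch of the pipeline this excerpt describes: K-means supplies the m ≪ n anchors (the O(dmnT) term), and each point is then connected to its s nearest anchors with kernel weights (the O(dmn) term). scikit-learn's KMeans and the Gaussian bandwidth choice are conveniences assumed for the sketch; they are not prescribed by [15], [13].

```python
import numpy as np
from sklearn.cluster import KMeans

def data_to_anchor_graph(X, m=50, s=3, seed=0):
    """Data-to-anchor kNN graph:
    1) K-means picks m << n anchors            -> O(d*m*n*T) for T iterations
    2) each point keeps its s nearest anchors  -> O(d*m*n) distance evaluations,
       weighted by a Gaussian kernel and row-normalized."""
    km = KMeans(n_clusters=m, n_init=1, random_state=seed).fit(X)
    anchors = km.cluster_centers_                               # (m, d)

    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)   # (n, m) squared distances
    nearest = np.argsort(d2, axis=1)[:, :s]                     # s nearest anchors per point
    near_d2 = np.take_along_axis(d2, nearest, axis=1)           # (n, s)
    sigma2 = near_d2.mean() + 1e-12                             # simple bandwidth choice

    Z = np.zeros_like(d2)
    rows = np.arange(X.shape[0])[:, None]
    Z[rows, nearest] = np.exp(-near_d2 / sigma2)                # data-to-anchor kernel weights
    Z /= Z.sum(axis=1, keepdims=True)                           # row-stochastic weights
    return Z, anchors

# Example: 500 points in 10-D, 50 anchors, 3 nearest anchors per point.
X = np.random.default_rng(1).normal(size=(500, 10))
Z, anchors = data_to_anchor_graph(X)
print(Z.shape, (Z > 0).sum(axis=1).max())   # (500, 50) and at most 3 nonzeros per row
```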