2018
DOI: 10.1007/s00500-018-3109-x
Adaptive multiple graph regularized semi-supervised extreme learning machine

Cited by 25 publications (9 citation statements)
References 55 publications
“…In recent years, how to exploit the local consistency of data to improve the performance of machine learning methods has attracted researchers' attention [45]. Based on the premise that similar samples should have similar properties, graph regularization is combined with our method to preserve local structural information, which may improve the classification performance of the method [13,47]. We use the label information of the training samples to construct an adjacency graph, and the graph regularization term is integrated to constrain the output weight matrix, so that similar samples yield similar outputs.…”
Section: Proposed CSRGELM
confidence: 99%
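The graph-regularization idea in the statement above — build an adjacency graph over samples and penalize dissimilar outputs for similar samples — can be sketched as follows. This is an illustrative reconstruction, not the cited paper's exact algorithm; the `knn_adjacency` helper and Gaussian edge weights are assumptions for the sketch.

```python
import numpy as np

def knn_adjacency(X, k=3):
    """Symmetric k-nearest-neighbour adjacency matrix with Gaussian edge weights."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nn = np.argsort(d2[i])[1:k + 1]                  # k nearest neighbours, skipping self
        W[i, nn] = np.exp(-d2[i, nn])
    return np.maximum(W, W.T)                            # symmetrize

def graph_regularizer(F, W):
    """tr(F^T L F) = 1/2 * sum_ij W_ij ||F_i - F_j||^2, with graph Laplacian L = D - W."""
    L = np.diag(W.sum(axis=1)) - W
    return np.trace(F.T @ L @ F)

# Two tight clusters of two points each.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
W = knn_adjacency(X, k=1)
F_smooth = np.array([[1.0], [1.0], [0.0], [0.0]])  # similar samples get similar outputs
F_rough  = np.array([[1.0], [0.0], [1.0], [0.0]])  # similar samples get different outputs
```

Minimizing this term alongside the training loss pushes the output weights toward assignments where `F_smooth`-like outputs (low penalty) are preferred over `F_rough`-like ones (high penalty).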
“…After feature extraction, the objective of the next step is to assign each pixel to one of two categories: shadow and object. The ELM [43][44][45][46][47][48] is adopted for classification, which has been proven to be effective and efficient in addressing a variety of classification problems. Next, the classification procedure using the ELM algorithm is described in detail.…”
Section: Classification Using Extreme Learning Machine
confidence: 99%
“…As a special type of single-hidden-layer feed-forward neural network, the extreme learning machine (ELM) has been extensively applied in several fields of machine learning [43][44][45][46][47][48]; its parameters are randomly generated and do not need to be tuned. Ghimire and Lee [48] proposed an online sequential extreme learning machine-based semi-supervised technique for moving cast shadow detection, which provided better generalization performance.
Section: Introduction
confidence: 99%
“…For example, the largest value of f_ij indicates the highest probability that sample x_i belongs to the j-th class. According to LP theory [37,38], nearby or similar samples, and samples from the same global cluster, should share similar labels. Therefore, the objective function of LP is defined as follows:…”
Section: Objective Function of SSRR-AGLP
confidence: 99%
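The snippet above is cut off before the equation. A common form of the label-propagation objective from the standard LP literature (e.g. Zhou et al.'s local-and-global-consistency formulation) — which may differ from the cited paper's exact equation — is:

```latex
Q(F) \;=\; \frac{1}{2}\sum_{i,j=1}^{n} W_{ij}
\left\| \frac{F_i}{\sqrt{D_{ii}}} - \frac{F_j}{\sqrt{D_{jj}}} \right\|^2
\;+\; \mu \sum_{i=1}^{n} \left\| F_i - Y_i \right\|^2
```

The first term enforces smoothness of the predicted labels $F$ over the similarity graph $W$ (with degree matrix $D$), matching the "similar samples share similar labels" principle; the second keeps predictions close to the initial labels $Y$, with $\mu$ trading off the two.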
“…Hence, it is difficult to satisfy this requirement for real-life samples [37]. Conversely, unlabeled samples can easily be collected from the Internet, web chats, surveillance cameras and so on [38]. As a result, it is necessary to utilize the information in unlabeled samples to improve the performance of the ridge regression algorithm [39,40].…”
Section: Introduction
confidence: 99%