2018
DOI: 10.1016/j.neucom.2018.08.028

Laplacian twin extreme learning machine for semi-supervised classification

Cited by 26 publications (11 citation statements)
References 23 publications
“…In this case, if there is no (WindowLABEL, Timestamp) index track consistent with the timestamp of the SID, multi-indexing is applied to the two tracks closest to that timestamp, one smaller and one larger. For example, if the timestamp of SID6 is 55, multiple indexing is applied to the tracks closest to SID6, namely (5, 50) and (6, 60). Since it is difficult to express the absolute time of time-series data, time is expressed relatively using the timestamp.…”
Section: A Time Window Size Setting For Timing Constraints
confidence: 99%
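The nearest-track lookup described in this statement can be sketched as follows. This is a minimal illustration only; the function and variable names (`find_index_tracks`, `tracks`) are assumptions for the example and do not come from the cited paper.

```python
from bisect import bisect_left

def find_index_tracks(tracks, sid_timestamp):
    """Return the (WindowLABEL, Timestamp) tracks used to index an SID.

    `tracks` is a list of (window_label, timestamp) tuples sorted by timestamp.
    If a track matches the SID timestamp exactly, only that track is returned;
    otherwise the nearest smaller and nearest larger tracks are both returned
    (multi-indexing), as in the example where timestamp 55 maps to (5, 50)
    and (6, 60).
    """
    timestamps = [t for _, t in tracks]
    i = bisect_left(timestamps, sid_timestamp)
    if i < len(tracks) and timestamps[i] == sid_timestamp:
        return [tracks[i]]                          # exact match: single index track
    lower = [tracks[i - 1]] if i > 0 else []        # nearest smaller track, if any
    upper = [tracks[i]] if i < len(tracks) else []  # nearest larger track, if any
    return lower + upper

# Example from the citation statement: SID6 has timestamp 55.
tracks = [(5, 50), (6, 60)]
print(find_index_tracks(tracks, 55))  # -> [(5, 50), (6, 60)]
```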
“…Unsupervised learning is the technique of letting a computer learn from unlabeled data in order to find latent rules [5]. Semi-supervised learning is a combination of supervised and unsupervised learning [6]. Reinforcement learning is the technique of determining a sequence of actions according to rewards [7].…”
Section: Introduction
confidence: 99%
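As a minimal illustration of how semi-supervised learning combines labeled and unlabeled data, the sketch below hides most labels and trains a self-training classifier on the mixture. The use of scikit-learn's `SelfTrainingClassifier` and the toy dataset are illustrative assumptions, not the method of the cited works.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

# Toy dataset: most labels are hidden (-1 marks an unlabeled sample),
# so the classifier must combine supervised and unsupervised information.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
y_semi = y.copy()
rng = np.random.RandomState(0)
y_semi[rng.rand(len(y)) < 0.8] = -1  # keep labels for only ~20% of samples

# Self-training: fit on labeled points, then iteratively pseudo-label the rest.
model = SelfTrainingClassifier(SVC(probability=True, gamma="auto"))
model.fit(X, y_semi)
print("accuracy on all samples:", model.score(X, y))
```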
“…The literature [8,9] notes that ELM has better classification performance than the support vector machine (SVM) [10]. Owing to its good generalization ability, ELM has been widely used in pattern recognition [11][12][13][14][15].…”
Section: Introduction
confidence: 99%
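For readers unfamiliar with ELM, the following is a minimal sketch of the basic training rule the statement refers to: random, fixed hidden-layer weights and a closed-form least-squares solution for the output weights. The function names and toy data are assumptions for illustration, not the cited implementation.

```python
import numpy as np

def train_elm(X, y, n_hidden=50, seed=0):
    """Minimal ELM: random hidden layer + least-squares output weights."""
    rng = np.random.RandomState(seed)
    W = rng.randn(X.shape[1], n_hidden)   # random input weights (kept fixed)
    b = rng.randn(n_hidden)               # random hidden biases (kept fixed)
    H = np.tanh(X @ W + b)                # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y          # output weights via the pseudoinverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy binary classification: targets in {-1, +1}, prediction by sign.
X = np.random.RandomState(1).randn(100, 5)
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
W, b, beta = train_elm(X, y)
acc = np.mean(np.sign(predict_elm(X, W, b, beta)) == y)
print("training accuracy:", acc)
```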
“…Many scholars have studied the optimization of ELM parameters and of the single-hidden-layer activation function. Li et al. [35] proposed replacing the connection weight matrix between the hidden layer and the output layer of the original ELM algorithm with the kernel function used in SVM; Li et al. [36] proposed a new Laplacian twin extreme learning machine (LapTELM) that fully exploits the benefits of large numbers of unlabeled samples while preserving the learning power and efficiency of the twin extreme learning machine (TELM); Fang et al. [37] introduced an ELM-based hierarchical framework for multimodal data and demonstrated that ELM has better learning efficiency than gradient-based multimodal deep learning methods; Shang et al. [38] developed a new predictive model combining classification and regression trees (CART) with the extreme learning machine (EELM), which improved the accuracy of hourly PM2.5 concentration prediction; Ming et al. [39] proposed two parallel variants of ELM. In order to accurately predict the amount and proportion of China's renewable energy terminal power consumption, this paper proposes a combined forecasting model.…”
Section: Introduction
confidence: 99%
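The LapTELM mentioned above belongs to the family of manifold-regularized ELM variants, which build a graph Laplacian over labeled and unlabeled samples and penalize predictions that vary sharply across neighboring points. The sketch below only illustrates that graph-Laplacian construction under assumed names (`graph_laplacian`, a k-NN connectivity graph); it is not the authors' formulation of LapTELM.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def graph_laplacian(X, n_neighbors=5):
    """Unnormalized graph Laplacian L = D - W from a symmetrized k-NN graph.

    Manifold-regularized ELM variants (e.g. LapTELM-style methods) add a term
    proportional to f^T L f to the training objective so that the prediction
    vector f varies smoothly over both labeled and unlabeled samples.
    """
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode="connectivity")
    W = 0.5 * (W + W.T)            # symmetrize the adjacency matrix
    W = W.toarray()
    D = np.diag(W.sum(axis=1))     # degree matrix
    return D - W

X = np.random.RandomState(0).randn(30, 4)  # labeled + unlabeled samples together
L = graph_laplacian(X)
f = np.random.RandomState(1).randn(30)     # some candidate prediction vector
print("smoothness penalty f^T L f:", float(f @ L @ f))
```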