2021
DOI: 10.1016/j.eswa.2021.115591
Incremental semi-supervised Extreme Learning Machine for Mixed data stream classification

Cited by 22 publications (7 citation statements)
References 45 publications
“…In order to verify the effectiveness of the model, this paper sets up two comparative experiments: the first compares the simulation results with the actual results, and the second compares the results of LSTM-CA and ELM-CA. ELM-CA [61] trains the ELM model on information collected about the burning and unburned areas, together with the wind speed recorded during a fire. The probability is then predicted from the combined action of the ELM and the wind.…”
Section: The Results of LSTM-CA and ELM-CA (mentioning)
Confidence: 99%
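As a rough illustration of what "training the ELM model" on burning-area and wind-speed information could involve, the sketch below fits a basic extreme learning machine (random hidden layer, least-squares output weights) to hypothetical per-cell fire features. The feature layout and the train_elm/predict_elm helpers are assumptions made for illustration only, not the ELM-CA implementation.

```python
import numpy as np

def train_elm(X, y, n_hidden=40, rng=None):
    """Fit a basic ELM: random fixed hidden layer, least-squares output weights."""
    rng = np.random.default_rng(rng)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never updated)
    b = rng.normal(size=n_hidden)                 # random hidden biases (never updated)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical per-cell features: fraction of burning neighbours, fraction of
# unburned neighbours, local wind speed; target is an observed ignition label.
X = np.random.rand(200, 3)
y = (0.6 * X[:, 0] + 0.4 * X[:, 2] + 0.1 * np.random.randn(200) > 0.5).astype(float)

W, b, beta = train_elm(X, y, n_hidden=40, rng=0)
ignition_prob = np.clip(predict_elm(X, W, b, beta), 0.0, 1.0)  # crude probability estimate
```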
“…By incorporating a substantial quantity of unlabeled samples lacking dominant variables into a limited number of labeled samples containing dominant variables, both the multi-rate problem and the predictive accuracy of soft-sensing models can be effectively addressed. This integration makes full use of the unlabeled samples, thereby resolving the multi-rate challenge and improving the prediction precision of soft-sensing models [31,32]. In the selection of fusion weights, time is used as a variable and its Pearson correlation coefficient is calculated; this time correlation effectively reflects the degree of correlation between the unlabeled and labeled samples.…”
Section: The Semi-supervised Learning Methods Based on Time Correlation (mentioning)
Confidence: 99%
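The cited passage only sketches the idea of time-based fusion weights. One possible reading, shown below purely as an assumption-laden illustration, weights each unlabeled sample by the Pearson correlation between its feature trajectory and that of the labeled sample closest to it in time; the variable names and the nearest-in-time pairing are hypothetical, not taken from the cited work.

```python
import numpy as np

def pearson(a, b):
    """Plain Pearson correlation coefficient between two 1-D arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical setup: labeled and unlabeled samples carry timestamps, and the
# fusion weight of each unlabeled sample comes from its correlation with the
# labeled sample nearest in time (one possible reading of "time correlation").
t_labeled = np.array([0.0, 10.0, 20.0, 30.0])
x_labeled = np.random.rand(4, 5)            # 5-step feature trajectories
t_unlabeled = np.array([4.0, 17.0, 26.0])
x_unlabeled = np.random.rand(3, 5)

weights = []
for t_u, x_u in zip(t_unlabeled, x_unlabeled):
    j = int(np.argmin(np.abs(t_labeled - t_u)))            # nearest labeled sample in time
    weights.append(max(pearson(x_u, x_labeled[j]), 0.0))   # clip negative correlation to zero
weights = np.asarray(weights)                               # fusion weights for unlabeled data
```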
“…During this process, the model can continuously acquire new knowledge without retraining. Recently, incremental learning has been widely used in artificial immune algorithms [32], incremental extreme learning machines [33], incremental support vector machines (SVMs) [34], incremental artificial neural networks (NNs), and other fields [35,36]. Among these methods, inspired by the working principles of the vertebrate immune system and by the need for self-organization and robustness in artificial intelligence systems, more and more researchers in the field focus on artificial immune algorithms.…”
Section: Introduction (mentioning)
Confidence: 99%
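Since the cited work lists incremental extreme learning machines among these methods, the following minimal sketch shows the usual online-sequential ELM style of update (recursive least squares on the output weights only), to give "acquiring new knowledge without retraining" a concrete shape. The OSELM class and its method names are illustrative assumptions, not code from the paper.

```python
import numpy as np

class OSELM:
    """Minimal online-sequential ELM sketch: fixed random hidden layer,
    output weights updated chunk by chunk with a recursive least-squares step."""

    def __init__(self, n_inputs, n_hidden, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.normal(size=(n_inputs, n_hidden))  # fixed random input weights
        self.b = rng.normal(size=n_hidden)               # fixed random hidden biases

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit_initial(self, X, Y):
        """Batch fit on the initial chunk."""
        H = self._hidden(X)
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ Y

    def partial_fit(self, X, Y):
        """Update output weights on a new chunk without revisiting old data."""
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (Y - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X0, Y0 = rng.random((100, 4)), rng.random((100, 1))
    model = OSELM(n_inputs=4, n_hidden=30, rng=0)
    model.fit_initial(X0, Y0)
    Xk, Yk = rng.random((20, 4)), rng.random((20, 1))
    model.partial_fit(Xk, Yk)   # new chunk absorbed; old data never revisited
```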