2020
DOI: 10.1109/access.2020.3035398

A Survey of Distributed and Parallel Extreme Learning Machine for Big Data

Abstract: Extreme learning machine (ELM) is characterized by good generalization performance, fast training speed, and little need for human intervention. With the explosion of data generated on the Internet, learning algorithms running in a single-machine environment can no longer meet the huge memory demands of the underlying matrix computations, so implementing distributed ELM algorithms has gradually become one of the main research focuses. In view of the research significance and implementation value of distributed ELM, this paper first…
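For context, training a basic ELM reduces to one random hidden-layer projection followed by a single linear least-squares solve, which is exactly the matrix computation whose memory footprint grows with the number of training samples. The following is a minimal single-machine sketch in NumPy; the sigmoid activation, hidden-layer size, and ridge term lam are illustrative assumptions rather than choices taken from the survey.

import numpy as np

def elm_train(X, T, n_hidden=100, lam=1e-3, seed=0):
    # Minimal ELM sketch: random hidden layer, analytic output weights.
    # X: (N, d) inputs, T: (N, m) targets. Memory is dominated by the
    # hidden-layer output matrix H of shape (N, n_hidden), which grows
    # linearly with the number of samples N -- the bottleneck that
    # motivates the distributed variants surveyed here.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden outputs
    # Regularized Moore-Penrose-style solve: beta = (H^T H + lam*I)^-1 H^T T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

On a small data set this completes in one pass; once N is large enough that H no longer fits in memory, the computation has to be partitioned, which is the setting the surveyed distributed and parallel ELM algorithms address.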

Cited by 9 publications (6 citation statements). References 89 publications (106 reference statements).
“…The authors showed that it is possible to apply ELM in the construction of CAD systems, so the perspective is broad and deserves further study. Moreover, Wang et al. [25] presented the research background of ELM and enhanced ELM, implementing a distributed ELM from matrix set operations. The authors highlight that the most computationally expensive operation is the multiplication involved in computing the MPGI (Moore-Penrose generalized inverse).…”
Section: ELM Based on Metaheuristics (mentioning)
Confidence: 99%
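To make the MPGI point concrete: with the regularized normal-equation form beta = (H^T H + lam*I)^-1 H^T T, the expensive products H^T H and H^T T decompose into sums over row blocks of H, one block per data partition, which is the matrix-set structure that distributed ELM implementations exploit. The sketch below illustrates that block-wise accumulation in NumPy, with a plain loop standing in for distributed workers; it is a simplified illustration under these assumptions, not the implementation of Wang et al. [25].

import numpy as np

def accumulate_normal_equations(partitions, W, b, lam, n_hidden):
    # Accumulate H^T H and H^T T over data partitions (row blocks of H).
    # Each partition (X_i, T_i) contributes H_i^T H_i and H_i^T T_i, so the
    # reduction is a simple sum -- the MapReduce-friendly structure that
    # distributed ELM variants rely on.
    HtH = lam * np.eye(n_hidden)          # start from the ridge term
    HtT = None
    for X_i, T_i in partitions:           # in a real system: one task per worker
        H_i = 1.0 / (1.0 + np.exp(-(X_i @ W + b)))   # local hidden-layer outputs
        HtH += H_i.T @ H_i                # local (L x L) product, summed globally
        HtT = H_i.T @ T_i if HtT is None else HtT + H_i.T @ T_i
    return np.linalg.solve(HtH, HtT)      # small (L x L) solve on the driver

Because each worker only ever materializes its own H_i, the full (N x L) matrix H is never formed, and the only data exchanged between nodes are the (L x L) and (L x m) partial sums.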
“…They conclude that the implementation of parallel and distributed ELM algorithms will become one of the key points of future research. At this point, it is important to highlight that, unlike the review by Wang et al. [25], our work, in addition to reviewing the distributed and parallel algorithms developed for ELM models, emphasizes describing the parallel architectures and tools used and the sizes of the databases, because these are very important aspects to consider when analyzing the performance of ELM variants on large-scale databases. In addition, considering the high computational cost of the MPGI, an updated review of parallel methods for solving systems of linear equations is included, because these developments are important for improving current ELM variants.…”
Section: ELM Based on Metaheuristics (mentioning)
Confidence: 99%
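The remark about parallel methods for systems of linear equations can also be illustrated: once H^T H + lam*I (an L x L matrix, with L the number of hidden nodes) and H^T T are assembled, beta can be obtained without forming an explicit inverse, for example with a conjugate-gradient iteration that needs only matrix-vector products, which parallelize naturally. The following is a generic textbook CG sketch for a single right-hand side, not a method proposed in the survey; multi-output targets would be handled column by column.

import numpy as np

def conjugate_gradient(A, rhs, tol=1e-8, max_iter=1000):
    # Solve A x = rhs for symmetric positive-definite A (here A = H^T H + lam*I).
    # Each iteration needs only one product A @ p, the operation that is
    # straightforward to distribute or parallelize.
    x = np.zeros_like(rhs)
    r = rhs - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:         # converged
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

Direct factorizations of the L x L system are usually cheap when L is modest; iterative solvers of this kind matter when L itself is large or when the system must stay distributed.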
“…More recently, Huang et al. [18], [19] proposed the Extreme Learning Machine (ELM) model, which has gained attention [20]–[22]. As with single hidden layer feedforward neural networks (SLFNs), ELM randomly selects hidden nodes and analytically determines the output weights.…”
Section: Introduction (mentioning)
Confidence: 99%
“…In order to cater for various application scenarios, many variants of ELM have been proposed, such as convex ELM [7], ordinal ELM [8], semi-supervised ELM [9], Bayesian ELM [10], voting-based ELM [11], PCA-ELM [12], robust ELM [13], microgenetic ELM [14], fuzzy ELM [15], hierarchical ELM [16], kernel ELM [17], ELM for multilayer perceptrons [18], stacked ELM [19], wavelet ELM [20], graph-embedded ELM [21], ELM for interval neural networks [22], memetic ELM [23], ELM with a binary output layer [24], ELM for imbalance learning [25], and ELM for residual learning [26]. Readers can refer to [27]–[29] for a thorough review of the essence and trends of ELM. Recently, online sequential ELM has been applied to various fields such as fault diagnosis [30] and air quality forecasting [31].…”
Section: Introduction (mentioning)
Confidence: 99%