2016
DOI: 10.1016/j.neucom.2016.05.039
Hessian semi-supervised extreme learning machine

Cited by 18 publications (8 citation statements)
References 37 publications
“…It should be noted that the neural network trained by the Hessian (H) in [17][18][19][20] and using (15), (18), (14), (9), (10) had l = 6 neurons in its hidden layer, a tuning factor of α = 0.0004, and a number of epochs of e = 40.…”
Section: Results of the Comparison
confidence: 99%
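The excerpt above fixes the hidden-layer size, tuning factor, and epoch count of the Hessian-trained network. The minimal Python/NumPy sketch below shows one way a Hessian-based (Newton-style) update with those quoted values (l = 6, α = 0.0004, e = 40) could look for the output weights of a single-hidden-layer network; the cited equations (15), (18), (14), (9), (10) are not reproduced, and all variable names are illustrative.

```python
import numpy as np

# Hypothetical sketch: a damped Newton (Hessian-based) update for the output
# weights of a single-hidden-layer network, using the hyperparameters quoted
# above (l = 6 hidden neurons, alpha = 0.0004, e = 40 epochs). The exact
# update equations of the cited works are not reproduced.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))             # toy inputs
t = np.sin(X[:, :1])                      # toy targets

l, alpha, epochs = 6, 0.0004, 40          # values quoted from the citing paper

W = rng.normal(size=(3, l))               # fixed random input weights
b = rng.normal(size=(1, l))
beta = np.zeros((l, 1))                   # trainable output weights

A = np.tanh(X @ W + b)                    # hidden-layer activations
H = A.T @ A + 1e-6 * np.eye(l)            # exact Hessian of the squared loss
                                          # w.r.t. beta, plus small damping
for _ in range(epochs):
    grad = A.T @ (A @ beta - t)           # gradient of 0.5 * ||A beta - t||^2
    beta -= alpha * np.linalg.solve(H, grad)  # damped Newton step

print("final MSE:", float(np.mean((A @ beta - t) ** 2)))
```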
“…In this section, we compare steepest descent (SD), steepest descent with mini-batches (SDMB) from [9][10][11][12], the Hessian (H) from [17][18][19][20], and the Hessian with mini-batches (HMB) from this investigation on electrical demand prediction. The goal of each algorithm is to drive the neural network output q_l to the target t_l as quickly as possible.…”
Section: Comparisons
confidence: 99%
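The four methods named in the excerpt differ only in whether the step uses the Hessian and whether it is computed on a mini-batch. A hedged sketch of the four update rules on a stand-in linear least-squares objective follows; the cited papers' network objective is not reproduced, and names such as `train`, `alpha`, and `batch` are illustrative.

```python
import numpy as np

# Hypothetical sketch of the four optimizers being compared: steepest descent
# (SD), steepest descent with mini-batches (SDMB), a Hessian/Newton step (H),
# and a Hessian step with mini-batches (HMB). The objective ||A w - t||^2
# stands in for the network training objective.

rng = np.random.default_rng(1)
A = rng.normal(size=(500, 6))
t = A @ rng.normal(size=(6, 1)) + 0.1 * rng.normal(size=(500, 1))

def train(hessian=False, batch=None, alpha=0.01, epochs=40):
    w = np.zeros((6, 1))
    n = A.shape[0]
    for _ in range(epochs):
        idx = rng.choice(n, batch, replace=False) if batch else slice(None)
        Ab, tb = A[idx], t[idx]
        g = Ab.T @ (Ab @ w - tb) / len(tb)           # (mini-batch) gradient
        if hessian:
            Hb = Ab.T @ Ab / len(tb) + 1e-6 * np.eye(6)
            w -= alpha * np.linalg.solve(Hb, g)      # Newton-style step
        else:
            w -= alpha * g                           # plain gradient step
    return float(np.mean((A @ w - t) ** 2))

for name, kw in [("SD", {}), ("SDMB", {"batch": 32}),
                 ("H", {"hessian": True}),
                 ("HMB", {"hessian": True, "batch": 32})]:
    print(name, train(**kw))
```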
“…However, Hessian regularization is not robust for many semi-supervised tasks; in particular, sampled data sets are usually not dense, which leads to an inaccurate estimate of the Hessian [26]. In this section, we present the LHRSS-ELM algorithm in detail, which integrates Laplacian and Hessian regularization terms that exploit the manifold structure of the data space to improve the performance of traditional ELM.…”
Section: B. LHRSS-ELM Formulation
confidence: 99%
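A minimal sketch of how a combined Laplacian-plus-Hessian regularizer can enter a semi-supervised ELM, assuming the standard SS-ELM closed form β = (I + GᵀCG + λGᵀMG)⁻¹GᵀCY. The citing paper's exact LHRSS-ELM objective is not reproduced; `lam`, `mu`, and the use of L @ L as a placeholder for the Hessian energy matrix (whose proper construction needs local tangent-space fits) are assumptions of this sketch.

```python
import numpy as np

# Hypothetical sketch of a semi-supervised ELM with a combined manifold
# regularizer M = mu * L + (1 - mu) * L @ L. L @ L is only a stand-in for
# the Hessian energy matrix; the paper's construction is not reproduced.

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 4))                 # 60 samples, few labeled
Y = np.zeros((60, 1))
labeled = np.arange(10)
Y[labeled] = np.sign(X[labeled, :1])         # toy labels on 10 samples

n_hidden, lam, mu = 20, 0.1, 0.5
W = rng.normal(size=(4, n_hidden))
b = rng.normal(size=(1, n_hidden))
G = np.tanh(X @ W + b)                       # ELM hidden-layer output matrix

# k-NN graph Laplacian L = D - S
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
S = np.zeros_like(d2)
for i in range(60):
    for j in np.argsort(d2[i])[1:6]:         # 5 nearest neighbours
        S[i, j] = S[j, i] = np.exp(-d2[i, j])
L = np.diag(S.sum(1)) - S

M = mu * L + (1 - mu) * (L @ L)              # combined regularizer (placeholder)
C = np.diag(np.isin(np.arange(60), labeled).astype(float))  # label weights

# closed-form SS-ELM output weights
beta = np.linalg.solve(np.eye(n_hidden) + G.T @ C @ G + lam * G.T @ M @ G,
                       G.T @ C @ Y)
pred = np.sign(G @ beta)                     # predictions for all samples
print("labeled-set accuracy:", float((pred[labeled] == Y[labeled]).mean()))
```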
“…Laplacian regularization is one of the most popular manifold regularizations: it uses the graph Laplacian to capture the geometry of the underlying manifold and has been used successfully in semi-supervised tasks [19], [34], [35], [37], [38]. However, when only a few labeled data are available, its performance worsens because it lacks extrapolating power, biases the solution towards a constant function, and cannot preserve the local topology [26], [25]. Another manifold regularization, Hessian regularization, allows the learned function to vary linearly along the data manifold, but the Hessian operator is time-consuming to compute, is not robust, and is costly in computation.…”
Section: Introduction
confidence: 99%
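The contrast drawn in this excerpt, Laplacian penalties favouring constant functions versus Hessian penalties vanishing on linear ones, can be seen numerically on a toy 1-D chain graph. The sketch below uses finite differences as an illustrative stand-in for the manifold operators; it is not the cited papers' construction.

```python
import numpy as np

# Hypothetical illustration: on a 1-D chain graph, the Laplacian penalty
# f^T L f vanishes only for constant f, while a discrete second-difference
# ("Hessian") penalty also vanishes for linear f, so Hessian regularization
# can preserve linear trends that Laplacian regularization flattens.

n = 10
x = np.arange(n, dtype=float)

# chain-graph Laplacian L = D - A
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1

# discrete second-difference operator, D2 @ f ~ f''
D2 = (np.eye(n, k=1) - 2 * np.eye(n) + np.eye(n, k=-1))[1:-1]

for name, f in [("constant", np.ones(n)), ("linear", x)]:
    lap = float(f @ L @ f)                  # Laplacian energy
    hess = float(np.sum((D2 @ f) ** 2))     # Hessian-style energy
    print(f"{name}: Laplacian={lap:.3f}, Hessian={hess:.3f}")
# constant: both 0; linear: Laplacian > 0, Hessian = 0
```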