2019
DOI: 10.1007/s11042-019-7233-0
Deep extreme learning machine with leaky rectified linear unit for multiclass classification of pathological brain images

Cited by 36 publications (19 citation statements)
References 35 publications
“…For the last step, viz., the NLAF $\beta$, it usually selects the rectified linear unit (ReLU) function [33]. Suppose $f_{ij}$ is the entry of the matrix $F$; we have …”
Section: Methods
Mentioning confidence: 99%
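The quoted formula is cut off at the “…”; for reference, the standard elementwise ReLU that the snippet names would act on each entry $f_{ij}$ of $F$ as follows (the notation is taken from the quote, the formula is the textbook definition, not the quoted continuation):

$$\beta(f_{ij}) = \max(0,\ f_{ij})$$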
“…They also developed a new deep network based on GELM-AE which converges faster than conventional deep networks. Nayak and Das [119] used ELM-AE to form a multilayer ELM and employed the leaky rectified linear unit (LReLU) as the activation function, which can be written as…”
Section: ELM for Semi-supervised and Unsupervised Learning
Mentioning confidence: 99%
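The quoted formula is likewise truncated; the standard LReLU definition it refers to, with a leak coefficient $a$ (the symbol is an assumption; small positive values such as $a = 0.01$ are typical), is:

$$\mathrm{LReLU}(x) = \begin{cases} x, & x > 0 \\ a\,x, & x \le 0 \end{cases}$$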
“…So, combining the constraint conditions, the mathematical expression of the SVM's goal is written as below [119]. The leaky rectified linear unit was used in the multilayer ELM.…”
Section: ELM vs Support Vector Machine (SVM)
Mentioning confidence: 99%
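The expression itself is not reproduced in the snippet; the standard soft-margin SVM objective is the likely referent (variable names here are assumptions, not taken from the citing paper):

$$\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}} \ \frac{1}{2}\lVert\mathbf{w}\rVert^{2} + C\sum_{i=1}^{N}\xi_{i} \quad \text{s.t.}\quad y_{i}\left(\mathbf{w}^{\top}\mathbf{x}_{i} + b\right) \ge 1 - \xi_{i},\ \ \xi_{i} \ge 0$$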
“…First, CNN is able to improve the capability and consistency of exploiting image features through its deep architecture. Second, there has been a great deal of improvement in regularisation techniques, including batch normalisation [45], LReLU [46] and residual learning [47]. These approaches allow CNN to accelerate and optimise the training of the denoising network efficiently.…”
Section: Denoising Network
Mentioning confidence: 99%
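To make the combination of techniques concrete, here is a minimal PyTorch sketch of a residual denoising block that uses batch normalisation [45], LReLU [46] and residual learning [47] together; it is an illustration under assumed hyperparameters, not the cited papers' exact architecture.

```python
# Illustrative sketch, not the cited papers' exact architecture: a residual
# denoising block combining batch normalisation [45], LReLU [46], and
# residual learning [47]. Channel count and slope are assumed values.
import torch
import torch.nn as nn

class DenoisingBlock(nn.Module):
    def __init__(self, channels: int = 64, negative_slope: float = 0.01):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),      # batch normalisation [45]
            nn.LeakyReLU(negative_slope),  # LReLU activation [46]
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.LeakyReLU(negative_slope)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual learning [47]: the block learns a correction to its
        # input rather than a full mapping, which speeds up training.
        return self.act(x + self.body(x))

block = DenoisingBlock()
noisy = torch.randn(1, 64, 32, 32)   # a batch of 64-channel feature maps
denoised = block(noisy)              # same shape as the input
```

The skip connection is the design point the quote is making: by predicting a residual rather than the clean image directly, the network trains faster and more stably, which is why these three techniques are commonly combined in denoising networks.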