2021
DOI: 10.1016/j.eswa.2021.114805

RSigELU: A nonlinear activation function for deep neural networks

Cited by 73 publications (32 citation statements)
References 18 publications
“…However, the ReLU activation function faces a problem in the negative region. Since negative values are set to zero, the derivative cannot be taken for these values and the learning process slows down [4,18]. One of the biggest advantages of the ReLU activation function is that its computational load is low compared to other functions and it can be widely used in multi-layered architectures.…”
Section: ReLU Activation Function
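A minimal NumPy sketch (not from the cited paper) of the negative-region problem the statement describes: negative inputs are clipped to zero, so their gradient is also zero and the corresponding weights receive no update.

```python
import numpy as np

def relu(x):
    # ReLU sets every negative input to zero
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative is 0 over the negative region, so those units stop learning
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]  <- zero gradient for x < 0
```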
“…Among the non-linear functions, parameterized activation functions allow us to work effectively on deep learning architectures thanks to their parameter values. Parametric activation functions used in the literature include LReLU, ELU, SELU, and RSigELU [4,8–12].…”
Section: Parametric Activation Functions
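A minimal sketch of two of the parametric functions named in the statement, LReLU and ELU; the parameter values of alpha below are illustrative choices, and the exact RSigELU formulation is given in the cited paper rather than reproduced here.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # LReLU: a small parametric slope keeps a nonzero gradient for x < 0
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation toward -alpha for x < 0
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.linspace(-3.0, 3.0, 7)
print(leaky_relu(x))
print(elu(x))
```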
“…In the formula, $g_j^{v,c}$ represents the weight of neuron j in the connection layer; $I_j^v$ represents the offset of neuron j; and N represents the number of neurons. The value of neuron j is input into a nonlinear activation function z [18]:…”
Section: Feature Recognition of Artistic Visual Image
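A minimal sketch of the step the statement describes, under the assumption that it is a standard fully connected layer: the weighted sum of the N inputs plus the offset of neuron j is passed through a nonlinear activation z (a sigmoid is used here purely as a placeholder; the cited paper's choice of z may differ).

```python
import numpy as np

def z(a):
    # Placeholder nonlinear activation (sigmoid)
    return 1.0 / (1.0 + np.exp(-a))

def connection_layer(x, G, I):
    # x: (N,) inputs, G: (M, N) weights g_j, I: (M,) offsets I_j
    # each neuron j computes z(sum_k g_{j,k} * x_k + I_j)
    return z(G @ x + I)

rng = np.random.default_rng(0)
x = rng.normal(size=4)        # N = 4 inputs
G = rng.normal(size=(3, 4))   # M = 3 neurons in the connection layer
I = np.zeros(3)
print(connection_layer(x, G, I))
```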
“…Based on the sampling law, the spatial resolution of the artistic visual image is increased to twice its range [20], and the original viewpoint value is affected and changed by the increase in high-frequency components. At this point, formula (18) is used to shift the control points:…”
Section: Defolding of Artistic Visual Images
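A minimal sketch of the resolution-doubling step, using plain bilinear interpolation as an assumed scheme; the control-point shift of formula (18) is defined in the citing paper and is not reproduced here.

```python
import numpy as np

def upsample_2x(img):
    # Double the spatial resolution by bilinear interpolation
    # (assumed scheme, not the cited paper's formula (18)).
    h, w = img.shape
    ys = np.linspace(0, h - 1, 2 * h)
    xs = np.linspace(0, w - 1, 2 * w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = (1 - wx) * img[np.ix_(y0, x0)] + wx * img[np.ix_(y0, x1)]
    bot = (1 - wx) * img[np.ix_(y1, x0)] + wx * img[np.ix_(y1, x1)]
    return (1 - wy) * top + wy * bot

img = np.arange(16, dtype=float).reshape(4, 4)
print(upsample_2x(img).shape)  # (8, 8)
```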