2021
DOI: 10.1016/j.eswa.2020.113977
New activation functions for single layer feedforward neural network

Cited by 46 publications (15 citation statements). References 10 publications.
“…For example, when the input is close to the saturation zone, the transformation is too slow and the derivative tends to zero, which causes a loss of information and the vanishing of the gradient (Oostwal et al., 2021). Moreover, the output of the sigmoid function is not zero-centered (Kocak & Siray, 2021). When the inputs to a neuron are all positive, the gradients of its weights will be all positive or all negative during backpropagation, which causes a zigzag (z-shaped) path when the weights are updated by gradient descent.…”
Section: Remote Sensing Estimation Methods for Water Quality (mentioning)
confidence: 99%
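The two failure modes quoted above are easy to demonstrate numerically. Below is a minimal sketch (an illustration assuming nothing beyond NumPy, not code from the cited papers) showing how the sigmoid derivative vanishes in the saturation zone and how its strictly positive outputs force all of a neuron's weight gradients to share one sign:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Saturation: the derivative peaks at 0.25 near x = 0 and vanishes
# for large |x|, which is the gradient disappearance described above.
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x = {x:5.1f}  sigmoid'(x) = {sigmoid_grad(x):.6f}")

# Non-zero-centered output: every sigmoid activation lies in (0, 1),
# so for a downstream neuron with scalar error signal delta, the
# gradients of all its incoming weights (delta * activation) share
# delta's sign. Gradient descent can then only move all those weights
# up or all down together, producing the z-shaped (zigzag) update path.
rng = np.random.default_rng(0)
activations = sigmoid(rng.normal(size=5))   # all positive
delta = -0.7                                # one error signal
print("weight gradients:", delta * activations)  # all same sign
```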
“…Ding et al. [8] In a study by Koçak and Şiray [16], some new activation functions were introduced that combine the advantages of the known activation functions and outperform them. To this end, some new activation functions derived from the sigmoid, named generalized swish, mean-swish, ReLU-swish, triple-state swish, sigmoid-algebraic, triple-state sigmoid, exponential swish, and sinc-sigmoid, were proposed.…”
Section: Önceki Çalışmalar (Previous Studies)
unclassified
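For reference, plain swish, x·σ(x), is the standard base these names build on. The sketch below shows it next to the sigmoid; the β-parameterized form is an illustrative assumption only, since the exact definitions of the paper's variants (generalized swish, mean-swish, and so on) are given in Koçak and Şiray (2021) and are not reproduced here:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    """Parametric swish: x * sigmoid(beta * x); beta = 1 is plain swish."""
    return x * sigmoid(beta * x)

x = np.linspace(-4, 4, 9)
print("x        :", np.round(x, 1))
print("sigmoid  :", np.round(sigmoid(x), 3))
print("swish    :", np.round(swish(x), 3))
print("swish b=2:", np.round(swish(x, beta=2.0), 3))
```

Unlike the sigmoid, swish is unbounded above and can output negative values near zero, which is one reason swish-style variants are proposed as fixes for the saturation and zero-centering issues quoted earlier.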
“…Additionally, note that even small variations in the architecture of a particular activation function might deliver better network performance (Koçak and Üstündağ Şiray, 2021). Thus, an exact or very close representation of an activation function does not ensure better network performance.…”
Section: Case Study B (mentioning)
confidence: 99%
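The point quoted above, that small variations in an activation function can shift network performance, can be illustrated with a toy single-hidden-layer network. The setup below (XOR data, layer sizes, learning rate) is an arbitrary illustrative choice, not the cited study's configuration:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Each entry: (activation, its derivative), used only in the hidden layer.
ACTS = {
    "sigmoid": (sigmoid, lambda x: sigmoid(x) * (1 - sigmoid(x))),
    "swish":   (lambda x: x * sigmoid(x),
                lambda x: sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x))),
}

def train(act_name, epochs=5000, lr=0.5, hidden=4):
    f, df = ACTS[act_name]
    rng = np.random.default_rng(0)           # identical init for both variants
    W1 = rng.normal(0, 1, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        z1 = X @ W1 + b1
        h = f(z1)                             # hidden activation under test
        out = sigmoid(h @ W2 + b2)            # sigmoid output for 0/1 targets
        d_out = out - y                       # dBCE/d(pre-output), summed form
        d_h = (d_out @ W2.T) * df(z1)         # backprop through hidden activation
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0)
    out = np.clip(out, 1e-12, 1 - 1e-12)
    return -np.mean(y * np.log(out) + (1 - y) * np.log(1 - out))

for name in ACTS:
    print(f"{name:8s} final training BCE: {train(name):.4f}")
```

Swapping only the hidden activation while holding initialization, data, and optimizer fixed isolates the effect of the activation's shape on trainability, which is the comparison the quoted remark is about.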