2023
DOI: 10.32604/cmc.2023.037028
A Universal Activation Function for Deep Learning

Abstract: Recently, deep learning has achieved remarkable results in fields that require human cognitive ability, learning ability, and reasoning ability. Activation functions are very important because they provide the ability of artificial neural networks to learn complex patterns through nonlinearity. Various activation functions are being studied to solve problems such as vanishing gradients and dying nodes that may occur in the deep learning process. However, it takes a lot of time and effort for researchers to use…
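The abstract names two failure modes that motivate new activation functions. A minimal sketch (not from the paper) illustrating both: the sigmoid's gradient saturates toward zero for large inputs (vanishing gradients), and ReLU's gradient is exactly zero for negative pre-activations (dying nodes):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s(x) * (1 - s(x)); peaks at 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, exactly 0 otherwise.
    return (x > 0).astype(float)

# Sigmoid gradient shrinks rapidly for large |x|; stacking many such
# layers multiplies these small factors, so gradients vanish.
print(sigmoid_grad(np.array([0.0, 5.0, 10.0])))

# ReLU gradient is 0 for any negative pre-activation; a node stuck
# in the negative region receives no gradient signal ("dying node").
print(relu_grad(np.array([-2.0, -0.1, 3.0])))
```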

Cited by 3 publications (1 citation statement)
References 25 publications
“…The OP-Tanish [61] achieves high accuracy but requires a long computation time. Hwang et al. [62] proposed a universal activation function that exhibits the properties of activation functions such as ReLU, Sigmoid, and Swish by adjusting three hyperparameters. The RAO [63] is another reconfigurable activation function exhibiting properties like ReLU, Sigmoid, and Tanh.…”
Section: Introduction
Confidence: 99%
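The citation statement describes activation functions that recover several classic shapes by tuning parameters. As an illustrative sketch only (this is the well-known one-parameter Swish family, not the paper's actual three-hyperparameter formulation), a single scalar β already interpolates between a linear map, Swish/SiLU, and ReLU:

```python
import numpy as np

def parametric_swish(x, beta):
    # Swish-style activation: x * sigmoid(beta * x).
    # beta = 1      -> standard Swish/SiLU
    # beta -> 0     -> approaches the linear function x / 2
    # beta -> large -> approaches ReLU, max(x, 0)
    return x / (1.0 + np.exp(-beta * x))

x = np.linspace(-4.0, 4.0, 9)
print(parametric_swish(x, 1.0))    # Swish/SiLU
print(parametric_swish(x, 50.0))   # close to ReLU
print(parametric_swish(x, 1e-6))   # close to x / 2
```

A reconfigurable activation of this kind lets the shape be tuned (or learned) per task instead of fixed in advance, which is the appeal the cited works share.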