2021
DOI: 10.48550/arxiv.2105.14614
Preprint

Evolution of Activation Functions: An Empirical Investigation

Cited by 1 publication (1 citation statement)
References: 0 publications
“…Their results demonstrate that replacing the standard ReLU activation function with these newly evolved functions improves the performance of neural networks on the CIFAR datasets. A similar approach is also investigated in [50], where the authors apply genetic programming to search for new activation functions. Through their experimental analysis, the authors demonstrate that their new activation functions outperform several standard functions on multivariate classification problems.…”
Section: Related Work
confidence: 99%
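The substitution described in the citation statement amounts to swapping a network's nonlinearity for a searched candidate. Below is a minimal sketch of that drop-in replacement, assuming a PyTorch-style model; the functional form of EvolvedActivation is a hypothetical illustration, not a function reported in the cited papers.

```python
import torch
import torch.nn as nn

class EvolvedActivation(nn.Module):
    # Hypothetical evolved activation of the kind such searches might produce;
    # the exact expression here is an illustrative assumption only.
    def forward(self, x):
        return x * torch.sigmoid(x) + 0.1 * torch.tanh(x)

def make_classifier(activation: nn.Module) -> nn.Sequential:
    # Small CIFAR-style classifier whose nonlinearity is passed in,
    # so the activation function is a drop-in choice.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(32 * 32 * 3, 256),
        activation,
        nn.Linear(256, 10),
    )

baseline = make_classifier(nn.ReLU())             # standard ReLU baseline
candidate = make_classifier(EvolvedActivation())  # evolved function swapped in
```

The two models are otherwise identical, which is the comparison setup implied by the cited work: only the activation function differs between baseline and candidate.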