Proceedings of the Genetic and Evolutionary Computation Conference Companion 2022
DOI: 10.1145/3520304.3533949
Evolution of activation functions for deep learning-based image classification

Abstract: Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, but it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population coevolutionary algorithm to evolve AFs.
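The selection problem described in the abstract (which AF to plug into a given FCN or CNN) can be made concrete with a small sketch. The following is an illustrative Python/PyTorch snippet, not the authors' code: it scores a hand-picked pool of standard AFs by briefly training a tiny fully connected network on toy random data, standing in for the validation-accuracy fitness that an evolutionary search over AFs would compute on a real image benchmark.

import torch
import torch.nn as nn

# A small, fixed pool of candidate AFs (the paper evolves AFs rather than
# choosing from a hand-made list; this pool is purely illustrative).
CANDIDATES = {
    "relu": nn.ReLU(),
    "tanh": nn.Tanh(),
    "elu":  nn.ELU(),
    "gelu": nn.GELU(),
}

def make_fcn(activation, in_dim=784, hidden=128, n_classes=10):
    # Fully connected network whose hidden-layer activation is swapped in.
    return nn.Sequential(
        nn.Linear(in_dim, hidden), activation,
        nn.Linear(hidden, hidden), activation,
        nn.Linear(hidden, n_classes),
    )

def fitness(model, x, y, epochs=5):
    # Briefly train and return accuracy, a stand-in for the fitness signal
    # that would drive selection of AFs in an evolutionary loop.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

# Toy data only; a real experiment would use an image dataset such as MNIST or CIFAR-10.
x = torch.randn(512, 784)
y = torch.randint(0, 10, (512,))
for name, act in CANDIDATES.items():
    print(f"{name}: accuracy {fitness(make_fcn(act), x, y):.3f}")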


Cited by 12 publications (5 citation statements)
References 25 publications

Citation statements:
“…Evolution may also be a solution for rendering models more robust. In [57] it was shown that combining different activation functions could be used to increase model accuracy; this approach might also be used for obtaining robustness.…”
Section: Discussion and Concluding Remarks (mentioning)
confidence: 99%
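As a rough, assumption-laden illustration of what "combining different activation functions" in a single model might look like (this PyTorch sketch is not the construction from [57]; the layer sizes and AF choices are arbitrary), each hidden layer can simply be given its own AF:

import torch.nn as nn

# Hypothetical mixed-activation FCN: every hidden layer uses a different AF.
mixed_model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.GELU(),
    nn.Linear(128, 64),  nn.ELU(),
    nn.Linear(64, 10),
)

An evolutionary search could then treat the per-layer AF assignment as part of the genome, which is the kind of combination the statement above refers to.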
“…Although the proposed method can construct activation functions that outperform traditional activation functions, like ReLU, due to the way in which solutions are represented, the method can only construct activation functions of a given structure. A coevolutionary algorithm to evolve new activation functions for standard fully-connected and convolutional neural networks is proposed in [52]. The authors use Cartesian GP, which can evolve new activation functions of various structures.…”
Section: Related Work (mentioning)
confidence: 99%
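Cartesian GP, as mentioned in the statement above, represents a candidate activation function as a small feed-forward graph of primitive operations. The snippet below is a minimal, illustrative evaluator of such a genome in plain Python/NumPy; the function set, encoding, and example genome are assumptions made for exposition and do not reproduce the representation used in [52].

import numpy as np

# Illustrative primitive set; every op takes two arguments even when only one
# is used, which keeps the genome encoding uniform.
OPS = {
    "add":  lambda a, b: a + b,
    "mul":  lambda a, b: a * b,
    "max":  lambda a, b: np.maximum(a, b),
    "tanh": lambda a, b: np.tanh(a),
    "neg":  lambda a, b: -a,
}

def evaluate_genome(genome, x):
    # values[0] is the raw pre-activation input x; each genome entry
    # (op, i, j) appends one node value; the last node is the AF's output.
    values = [x]
    for op, i, j in genome:
        values.append(OPS[op](values[i], values[j]))
    return values[-1]

# Hypothetical genome encoding x * tanh(x), a smooth ReLU-like curve.
genome = [("tanh", 0, 0), ("mul", 0, 1)]
x = np.linspace(-3.0, 3.0, 7)
print(evaluate_genome(genome, x))

Because the genome is just a list of nodes, a mutation can change an op or rewire an input, which is what lets the search explore activation functions of varying structure.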
“…To fix this problem, some modifications were made, which resulted in several variants of the ReLU activation function. These activation functions include: Leaky ReLU [21], SELU [22], PReLU [23], GELU [24], ELU [25], RReLU [26], CELU [27] and ReLU6 [28].…”
Section: Activation Functions (mentioning)
confidence: 99%
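For reference, most of the ReLU variants listed above have simple closed-form definitions. The NumPy sketch below shows the standard formulas (SELU constants from Klambauer et al., 2017; GELU in its common tanh approximation); PReLU and RReLU are omitted since their negative-side slopes are learned or randomly sampled rather than fixed.

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small, fixed negative-side slope.
    return np.where(x > 0, x, alpha * x)

def relu6(x):
    # ReLU clipped at 6.
    return np.minimum(np.maximum(0.0, x), 6.0)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # Fixed scale and alpha from the SELU paper.
    lam, alpha = 1.0507009873554805, 1.6732632423543772
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def celu(x, alpha=1.0):
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x / alpha) - 1.0))

def gelu(x):
    # Tanh approximation of x * Phi(x).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))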