2020
DOI: 10.1007/s00521-020-05182-1

“SPOCU”: scaled polynomial constant unit activation function

Abstract: We address the following problem: given a set of complex images or a large database, the numerical and computational complexity and quality of approximation of a neural network may differ drastically from one activation function to another. A general novel methodology, the scaled polynomial constant unit activation function “SPOCU,” is introduced and shown to work satisfactorily on a variety of problems. Moreover, we show that SPOCU can outperform previously introduced activation functions with good properties, e.g., …

Cited by 72 publications (24 citation statements)
References: 22 publications

Citation statements (ordered by relevance):
“…Following the publication of our article [1], we have become aware that the legends of Fig. 8b should read as follows: generator h(·) of the activation function S. Moreover, the following Fig.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…Shortly after publication of our article [1], we received questions from the community about how to implement SPOCU, since it is not included in current software packages, e.g., Matlab or Python (Keras).…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
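The snippet above points out that SPOCU has to be implemented by hand because it does not ship with Matlab or Keras. Below is a minimal NumPy sketch, assuming the commonly quoted form s(x) = α·h(x/γ + β) − α·h(β), where the generator h(·) applies the polynomial r(u) = u^3(u^5 − 2u^4 + 2), returns 0 for negative arguments, and stays constant at r(c) above a cap c. The parameter defaults used here are illustrative placeholders, not the authors' recommended settings, so the exact definition and constants should be checked against the original article [1].

```python
import numpy as np

def spocu(x, alpha=1.0, beta=0.5, gamma=1.0, c=2.0):
    """SPOCU-style activation: s(x) = alpha * h(x / gamma + beta) - alpha * h(beta).

    Assumed form only: the generator h uses r(u) = u**3 * (u**5 - 2*u**4 + 2),
    is 0 for u < 0, and is held constant at r(c) for u >= c.
    The default parameter values are placeholders for illustration.
    """
    x = np.asarray(x, dtype=float)

    def r(u):
        return u ** 3 * (u ** 5 - 2.0 * u ** 4 + 2.0)

    def h(u):
        # Piecewise generator: 0 below zero, polynomial on [0, c), flat above c.
        return np.where(u < 0.0, 0.0, np.where(u < c, r(u), r(c)))

    return alpha * h(x / gamma + beta) - alpha * h(beta)


# Quick sanity check: by construction s(0) = 0.
z = np.linspace(-3.0, 3.0, 7)
print(spocu(z))
```

In Keras, the same expression can be wrapped as a custom activation (a Python callable built from backend operations such as tf.where) and passed to a layer's activation argument; in Matlab it maps directly onto element-wise array operations.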
“…More recent activation functions, such as the scaled polynomial constant unit activation function (SPOCU) [22], could potentially result in a higher classification accuracy. However, for the application of uncertainty quantification, we found that the ReLU activation function achieves similar results.…”
Citation type: mentioning
Confidence: 99%