9th International Conference on Artificial Neural Networks: ICANN '99 1999
DOI: 10.1049/cp:19991129
Neural networks with periodic and monotonic activation functions: a comparative study in classification problems

Cited by 41 publications (22 citation statements)
References 0 publications
“…Lack of such a limit can lead to instability [31]. Periodic activation functions also introduce many local minima [28].…”
Section: Learning Periodic Functions
confidence: 99%
“…We also constructed an MLP with a hidden layer of 24 sinusoidal units, as in [34]. Initial frequencies for BPW were randomly assigned to the interval [−3.5, 3.5], and the initial range for the coefficients was 0.0001.…”
Section: The Two Spirals Problem
confidence: 99%
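The setup in the excerpt above can be sketched in a few lines: a feed-forward network whose 24 hidden units use a sinusoidal activation, with input weights ("frequencies") drawn uniformly from [−3.5, 3.5] and small initial output coefficients. The two-spirals data generator, the phase initialization, and the sigmoid output unit are assumptions for illustration, not details taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_spirals(n=97):
    # Classic two-spirals benchmark: two interleaved spirals, one per class.
    i = np.arange(n)
    phi = i / 16 * np.pi
    r = 6.5 * (104 - i) / 104
    x = np.c_[r * np.cos(phi), r * np.sin(phi)]
    X = np.vstack([x, -x])                 # second spiral is the mirror image
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

# Hidden layer of 24 sinusoidal units, as in the excerpt.
n_hidden = 24
W = rng.uniform(-3.5, 3.5, size=(2, n_hidden))   # "frequencies" in [-3.5, 3.5]
b = rng.uniform(-np.pi, np.pi, size=n_hidden)    # phases (hypothetical choice)
v = rng.normal(scale=1e-4, size=n_hidden)        # small initial coefficients

def forward(X):
    h = np.sin(X @ W + b)                  # periodic hidden activation
    return 1.0 / (1.0 + np.exp(-(h @ v)))  # sigmoid output for binary class

X, y = two_spirals()
p = forward(X)
print(p.shape)  # one probability per sample
```

From here, the weights would be trained with backpropagation as usual; only the hidden activation differs from a standard sigmoidal MLP.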
“…The multi-valued neuron appears in activation functions and neural networks where the input and output values are both complex numbers of unit length, which directly leads to considering the sine and cosine functions as activations in this case. Training time for periodic activation functions has also been compared against other activation functions such as the sigmoid [11]. It would be interesting to see how periodic functions compare to more recently adopted activations such as the rectified linear unit.…”
Section: Introduction
confidence: 99%
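To make the contrast in the excerpt above concrete, here is a minimal side-by-side of the three activations mentioned: sine (periodic, bounded, non-monotonic), the logistic sigmoid (monotonic, bounded), and the rectified linear unit (monotonic, unbounded above). The sample points are illustrative.

```python
import numpy as np

def sine(x):
    # Periodic and bounded in [-1, 1]; repeats every 2*pi.
    return np.sin(x)

def sigmoid(x):
    # Monotonic and bounded in (0, 1); saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Monotonic and unbounded above; zero for negative inputs.
    return np.maximum(0.0, x)

x = np.linspace(-10, 10, 5)
for f in (sine, sigmoid, relu):
    print(f.__name__, np.round(f(x), 3))
```

The key structural difference is periodicity: `sine(x)` equals `sine(x + 2*pi)` for every input, whereas the sigmoid and ReLU are injective-or-saturating, which is one reason periodic activations introduce many additional local minima during training.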