2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
DOI: 10.1109/smc42975.2020.9282886

Why Squashing Functions in Multi-Layer Neural Networks

Cited by 7 publications (7 citation statements). References 8 publications.
“…This model also seems to explain the great success of the rectified linear unit (ReLU). In [33], the authors elaborate on the empirical success of squashing functions in neural networks by showing that the formulae describing this family follow from natural symmetry requirements.…”
Section: Neural Network Based on Nilpotent Fuzzy Logic and MCDM Tools
Mentioning confidence: 99%
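The family of squashing functions referenced above can be illustrated with a minimal sketch. The exact parametrization from the cited paper is not reproduced here; as an assumption, the sketch uses a difference of scaled softplus terms, a standard smooth approximation of the cutoff clip(x, 0, 1) that sharpens toward a piecewise-linear (ReLU-like) shape as the steepness parameter beta grows:

```python
import numpy as np

def softplus(x, beta):
    # Numerically stable scaled softplus: (1/beta) * log(1 + exp(beta*x)).
    return np.logaddexp(0.0, beta * x) / beta

def squashing(x, beta):
    # Smooth approximation of the cutoff clip(x, 0, 1):
    # as beta -> infinity, softplus(x, beta) -> max(x, 0), so the
    # difference tends to max(x, 0) - max(x - 1, 0) = clip(x, 0, 1).
    return softplus(x, beta) - softplus(x - 1.0, beta)

x = np.linspace(-2.0, 3.0, 11)
for beta in (1.0, 5.0, 50.0):
    err = np.max(np.abs(squashing(x, beta) - np.clip(x, 0.0, 1.0)))
    print(f"beta={beta:5.1f}  max deviation from clip = {err:.4f}")
```

For beta = 50 the maximum deviation from the hard cutoff is already below 0.02 (it is ln(2)/beta at the kinks), which suggests why such smooth families behave much like rectified-linear units in practice.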
“…Observe that in this model we have implemented rectified linear layers. Recent investigations have demonstrated that rectified linear functions are the most effective in representing data processing in neural networks [5], particularly for networks with many layers (Urenda et al., 2020). Additionally, the ordinal classification problem of the therapy length is, based on Kramer et al., handled as a regression problem with an additional postprocessing step (Kramer et al., 2001).…”
Section: Baseline Models
Mentioning confidence: 99%
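The Kramer et al. reduction mentioned in this excerpt treats the K ordered classes as numeric ranks, fits an ordinary regressor on those ranks, and maps real-valued predictions back to classes in a postprocessing step. The sketch below shows only the reduction and the postprocessing; the class names and the nearest-rank rounding rule are illustrative assumptions, not details taken from the cited papers:

```python
import numpy as np

def ordinal_to_regression_targets(labels, ordered_classes):
    # Map each ordinal class label to its rank (0, 1, ..., K-1),
    # so an ordinary regressor can be trained on the ranks.
    rank = {c: i for i, c in enumerate(ordered_classes)}
    return np.array([rank[c] for c in labels], dtype=float)

def postprocess_to_classes(predictions, ordered_classes):
    # Postprocessing step: round each real-valued prediction to the
    # nearest valid rank and map it back to the ordinal class.
    ranks = np.clip(np.rint(predictions), 0, len(ordered_classes) - 1)
    return [ordered_classes[int(r)] for r in ranks]

# Hypothetical therapy-length classes, ordered from shortest to longest.
classes = ["short", "medium", "long"]
y = ordinal_to_regression_targets(["short", "long", "medium"], classes)
print(y)                                      # [0. 2. 1.]
print(postprocess_to_classes([0.2, 1.7, 2.9], classes))
# ['short', 'long', 'long']
```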
“…In mathematical terms, this means that the corresponding group must be finite-dimensional. It is known that under reasonable conditions, any finite-dimensional transformation group that contains all linear transformations contains only fractional-linear transformations, i.e., transformations of the type x → (a + b·x)/(1 + c·x) [6,7,10,11], etc.…”
Section: What Probability Distributions Satisfy This Invariance Requirement?
Mentioning confidence: 99%
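The fractional-linear form can be made concrete with a small numerical check. Each map x → (a + b·x)/(1 + c·x) corresponds to a 2×2 matrix acting projectively, and composing two such maps corresponds to multiplying their matrices, which is why the family is closed under composition and forms a finite-dimensional group. The coefficients below are arbitrary illustrations:

```python
import numpy as np

def apply_moebius(M, x):
    # Apply the fractional-linear map
    # x -> (M[0,0]*x + M[0,1]) / (M[1,0]*x + M[1,1]).
    return (M[0, 0] * x + M[0, 1]) / (M[1, 0] * x + M[1, 1])

# Two arbitrary fractional-linear transformations as 2x2 matrices.
F = np.array([[2.0, 1.0], [0.5, 1.0]])   # x -> (2x + 1) / (0.5x + 1)
G = np.array([[1.0, -3.0], [0.2, 1.0]])  # x -> (x - 3) / (0.2x + 1)

x = 1.7
# Composing the maps equals applying the product matrix, so the
# family is closed under composition.
print(apply_moebius(F, apply_moebius(G, x)))
print(apply_moebius(F @ G, x))            # same value
```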