2014 10th International Computer Engineering Conference (ICENCO)
DOI: 10.1109/icenco.2014.7050434

M-Estimators based activation functions for robust neural network learning

Abstract: Multi-layer feed-forward neural networks have proven very successful in many applications, such as industrial modeling, classification, and function approximation. Training data containing outliers are a frequent problem for these supervised learning methods, which may then fail to reach acceptable performance. Robust neural network learning algorithms are often applied to deal with gross errors and outliers. Recently, many researchers have exploited M-estimators as performance functions…
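As a rough, hedged illustration of the idea the abstract refers to (an M-estimator used as the performance function in place of MSE), the NumPy sketch below trains a one-hidden-layer regression network with Huber's ρ as the loss. The network size, learning rate, and threshold δ are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def huber_rho(r, delta=1.0):
    """Huber M-estimator rho: quadratic near zero, linear in the tails,
    so large residuals (outliers) contribute less than under MSE."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def huber_psi(r, delta=1.0):
    """Derivative (influence function) of huber_rho, used in backprop."""
    return np.clip(r, -delta, delta)

def train(x, y, hidden=10, lr=0.01, epochs=2000, delta=1.0, seed=0):
    """One-hidden-layer regression network trained with the Huber loss
    instead of MSE (illustrative sizes and hyperparameters)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)               # hidden layer
        y_hat = h @ W2 + b2                    # linear output
        r = y_hat - y                          # residuals
        g_out = huber_psi(r, delta) / len(x)   # bounded gradient of the loss
        W2 -= lr * h.T @ g_out;  b2 -= lr * g_out.sum(0)
        g_hid = (g_out @ W2.T) * (1 - h**2)
        W1 -= lr * x.T @ g_hid;  b1 -= lr * g_hid.sum(0)
    return W1, b1, W2, b2
```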

Cited by 7 publications (4 citation statements) · References 6 publications
“…The dataset is then contaminated in the x-y axis by Gaussian noise with a mean of zero and a standard deviation of 0.1, G2 ~ N(0, 0.1). A variable percentage, ε, of data was randomly selected and then replaced, with probability ε, by background noise uniformly distributed in the specific range [24,26,29,31,32].…”
Section: Simulation Results
confidence: 99%
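The contamination protocol quoted above can be sketched as follows. The ε value, the uniform background range, and the choice to replace only the target values are assumptions here; the quoted passage specifies only the Gaussian noise N(0, 0.1) and that a fraction ε is replaced by uniform background noise.

```python
import numpy as np

def contaminate(x, y, eps=0.2, noise_sd=0.1, bg_low=-1.0, bg_high=1.0, seed=0):
    """Contamination scheme as described in the quoted passage:
    Gaussian noise N(0, noise_sd) is added in both x and y, then a fraction
    eps of the samples is replaced by uniform background noise.
    The background range (bg_low, bg_high), eps, and replacing only y
    are assumptions; the quote says only 'a specific range'."""
    rng = np.random.default_rng(seed)
    xn = x + rng.normal(0.0, noise_sd, size=x.shape)
    yn = y + rng.normal(0.0, noise_sd, size=y.shape)
    outliers = rng.random(len(y)) < eps            # select roughly eps of the samples
    yn[outliers] = rng.uniform(bg_low, bg_high, size=int(outliers.sum()))
    return xn, yn, outliers
```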
“…The research [23] presented a set of robust, statistical M-estimators as a replacement for the conventional MSE loss function using good, noise-free information. Furthermore, new transfer functions that depend on robust statistical M-estimators were proposed as alternatives to traditional transfer functions, using a dataset containing outliers [24].…”
Section: Relevant Work
confidence: 99%
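The quoted statement describes transfer (activation) functions built from robust M-estimators [24]. One hedged illustration of that idea is to use Huber's ψ (influence) function, which is bounded and saturating like a hard tanh; this is an illustrative construction, not necessarily the exact functions proposed in [24].

```python
import numpy as np

def huber_psi_activation(z, delta=1.0):
    """Activation built from Huber's influence (psi) function: identity
    near zero, saturating at +/-delta, similar in shape to a hard tanh.
    Illustrative choice, not necessarily the cited paper's construction."""
    return np.clip(z, -delta, delta)

def huber_psi_grad(z, delta=1.0):
    """Derivative of the activation for backpropagation: 1 inside the
    linear region, 0 where the unit saturates."""
    return (np.abs(z) <= delta).astype(z.dtype)
```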
“…Gomes et al. [32] analyzed the performance of different activation functions in neural networks to accurately forecast time series data. Later, Essai and Ellah [33] performed experiments using robust M-estimator objective functions as activation functions, which outperformed the activation functions used earlier in the literature. Freire and Barreto [34] used the idea of batch intrinsic plasticity (BIP) to maximize hidden layer information combined with robust estimation of the output weights.…”
Section: Introduction
confidence: 99%