2014 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB)
DOI: 10.1109/ccmb.2014.7020706
Distributed robust training of multilayer neural networks using normalized risk-averting error

Cited by 3 publications (2 citation statements)
References 17 publications
“…One of the purposes of this work is to model highly nonlinear functions with as few neurons as possible to enable better generalization abilities. It can be observed from Table 1 that 55 is the smallest number of hidden neurons at which small median and average errors E_train(w) < 10^{-3} suitable for modeling [18] are obtained. The network structures for the other problems are also determined in similar simulations.…”
Section: Function Approximation Problems (mentioning)
confidence: 99%
“…The problem, called the local minimum problem in training DNNs, has plagued the DNN community since the 1980s [1,2]. DNNs trained with backpropagation have been extensively utilized for decades to solve various tasks in artificial intelligence [3–7]. The computing power of DNNs derives from their particularly distributed structure and their capability to learn and generalize.…”
Section: Introduction (mentioning)
confidence: 99%