2012
DOI: 10.5392/ijoc.2012.8.4.030

Contour Plots of Objective Functions for Feed-Forward Neural Networks

Abstract: Error surfaces provide very important information for the training of feed-forward neural networks (FNNs).

Cited by 3 publications (5 citation statements) | References 19 publications (27 reference statements)
“…And, the nCE error function with n=4 shows a flatter optimal output than the CE and nCE with n=2 cases. The two-dimensional contour plots of the CE and nCE error functions also show the same property [17]. So, we can argue that the property of the divergence measures derived from CE and nCE coincides with the two-dimensional contour plots of the CE and nCE error functions in [17] and the optimal outputs in [6] and [18].…”
Section: New Divergence Measure From the n-th Order Extension of Cross-Entropy (supporting)
confidence: 56%
“…Since the path prediction in this paper is realized through chain prediction between nodes, every prediction can actually be regarded as a classification problem. By comparing common cost functions such as mean squared error, root mean squared error, mean absolute error and cross entropy [57,58,59], the cross entropy, which is appropriate for solving classification problems, is selected as the cost function:

$$C = -\frac{1}{N}\sum_{x} p(x)\log q(x),$$

where $N$ represents the total number of samples, $p(x)$ refers to the probability of the true distribution, and $q(x)$ refers to the probability estimate calculated through the model. The performance of the cost function is affected by the total number of samples, which should be taken into account for an accurate result.…”
Section: Methods (mentioning)
confidence: 99%
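As a concrete illustration of the cost function quoted above, the following is a minimal Python sketch of the mean cross-entropy cost, assuming p(x) is supplied as one-hot true distributions and q(x) as model probability estimates; the function name and sample data are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def cross_entropy_cost(p, q, eps=1e-12):
    """Mean cross-entropy over N samples.

    p : (N, K) array of true distributions (e.g. one-hot labels).
    q : (N, K) array of probability estimates from the model.
    """
    q = np.clip(q, eps, 1.0)  # guard against log(0)
    return -np.mean(np.sum(p * np.log(q), axis=1))

# Illustrative example: N = 3 samples, K = 2 classes.
p = np.array([[1, 0], [0, 1], [1, 0]], dtype=float)
q = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
print(cross_entropy_cost(p, q))  # ~0.280
```

Averaging over N reflects the excerpt's point that the cost depends on the total number of samples.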
“…Since the path prediction in this paper is realized through chain prediction between nodes, every prediction can actually be regarded as a classification problem. By comparing common cost functions such as mean squared error, root mean squared error, mean absolute error and cross entropy [57][58][59], the cross entropy, which is appropriate for solving classification problems, is selected as the cost function:…”
Section: Path Planning Applying BP Neural Network (mentioning)
confidence: 99%
“…For real applications of FNNs, how to train them is still a challenging problem, and various objective functions have been proposed to improve the performance of FNNs [5]-[8]. Since these objective functions have peculiar properties, they can be compared in diverse ways [9]-[13].…”
Section: Introduction (mentioning)
confidence: 99%
“…These results show why some objective functions suffer from over-fitting and slow convergence of learning. Also, the contours of objective functions were derived in a two-dimensional space [9], which gives us more informative comparison results for the training of FNNs.…”
Section: Introduction (mentioning)
confidence: 99%
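To give a sense of what the two-dimensional contours mentioned in this excerpt look like, here is a minimal Python sketch that draws contour lines of the cross-entropy (CE) error over a plane of two sigmoid outputs; the target values (t1 = 1, t2 = 0) and grid resolution are illustrative assumptions, not details from [9].

```python
import numpy as np
import matplotlib.pyplot as plt

def ce_error(y1, y2, t1=1.0, t2=0.0, eps=1e-12):
    """CE error summed over two independent sigmoid outputs."""
    def ce(y, t):
        y = np.clip(y, eps, 1.0 - eps)  # keep log() finite
        return -(t * np.log(y) + (1.0 - t) * np.log(1.0 - y))
    return ce(y1, t1) + ce(y2, t2)

# Grid over the two-dimensional output space (0, 1) x (0, 1).
y1, y2 = np.meshgrid(np.linspace(0.01, 0.99, 200),
                     np.linspace(0.01, 0.99, 200))
cs = plt.contour(y1, y2, ce_error(y1, y2), levels=15)
plt.clabel(cs, inline=True, fontsize=8)
plt.xlabel("output y1 (target t1 = 1)")
plt.ylabel("output y2 (target t2 = 0)")
plt.title("Contour plot of the CE error over two outputs")
plt.show()
```

The contours tighten toward the corner (y1, y2) = (1, 0), the optimal output for these targets, which is the kind of shape information such plots provide for comparing objective functions.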