2006
DOI: 10.1007/s11063-006-9014-9
CB3: An Adaptive Error Function for Backpropagation Training

Abstract: Effective backpropagation training of multi-layer perceptrons depends on the incorporation of an appropriate error or objective function. Classification-Based (CB) error functions are heuristic approaches that attempt to guide the network directly to correct pattern classification rather than using common error minimization heuristics, such as Sum-Squared Error (SSE) and Cross-Entropy (CE), which do not explicitly minimize classification error. This work presents CB3, a novel CB approach that learns the error …
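The contrast the abstract draws can be made concrete with a small sketch. The code below is illustrative only: the function names, the fixed margin, and the simple "penalize only insufficient classification margin" rule are assumptions chosen to convey the CB idea, not the published CB3 formulation (which, per the paper, learns its margins during training).

```python
# Minimal sketch (not the paper's exact CB3 rule): contrast sum-squared
# error, which penalizes every deviation from the 0/1 targets, with a
# classification-based (CB) style criterion that produces error only when
# the correct class does not win by a sufficient margin.
import numpy as np

def sse_error(outputs, targets):
    """Sum-squared error over one pattern's outputs."""
    return np.sum((outputs - targets) ** 2)

def cb_style_error(outputs, target_class, margin=0.1):
    """Illustrative CB-style criterion: error is nonzero only if the
    target-class output fails to exceed the best competing output by
    at least `margin`; a confidently correct pattern contributes 0."""
    target_out = outputs[target_class]
    competitors = np.delete(outputs, target_class)
    gap = target_out - competitors.max()
    return max(0.0, margin - gap)

outputs = np.array([0.62, 0.55, 0.10])   # hypothetical network outputs
targets = np.array([1.0, 0.0, 0.0])      # one-hot target for class 0
print(sse_error(outputs, targets))       # penalizes all deviations
print(cb_style_error(outputs, 0))        # penalizes only the small margin
```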

Cited by 12 publications (12 citation statements) · References 8 publications · Citing publications: 2009–2020
“…In so doing, they perform relatively minimal updates to network parameters in order to discourage premature weight saturation and overfitting. CB3 is a CB approach that learns the error function to be used while training [110]. This is accomplished by learning pattern confidence margins during training, which are used to dynamically set output target values for each training pattern.…”
Section: Criterion Functions (mentioning; confidence: 99%)
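The mechanism this citation describes, learned per-pattern confidence margins used to set output targets dynamically, can be sketched as follows. The sketch is an assumption-laden illustration of that one step: the target-setting rule, clipping, and all names here are hypothetical and are not the published CB3 update equations.

```python
# Illustrative sketch of dynamic target setting from a per-pattern
# confidence margin (assumed details, not the published CB3 rules).
import numpy as np

def dynamic_targets(outputs, target_class, margin):
    """Set targets relative to the pattern's current outputs: push the
    target-class output up by `margin`, push competing outputs down by
    `margin`, and clip to the valid [0, 1] output range."""
    targets = outputs.copy()
    targets[target_class] = min(1.0, outputs[target_class] + margin)
    mask = np.arange(outputs.size) != target_class
    targets[mask] = np.clip(outputs[mask] - margin, 0.0, 1.0)
    return targets

outputs = np.array([0.62, 0.55, 0.10])          # hypothetical outputs
print(dynamic_targets(outputs, 0, margin=0.1))  # -> [0.72, 0.45, 0.0]
```

Because the targets sit close to the current outputs, the resulting weight updates stay small, which is consistent with the quoted point about discouraging premature weight saturation and overfitting.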
“…In the meantime, researchers have identified that a proper cost function is an important factor in improving the performance of standard BP in terms of convergence speed [3]; [4]; [5]; [6]; [7]; [8]; [9]; [10]; [11]; [12]; [13]; [14], achieving higher accuracy [15]; [16], and overcoming the problem of getting stuck in local minima [15]; [6]; [9]; [10]; [17]; [14].…”
Section: Introduction (mentioning; confidence: 99%)
“…It has been observed that the MSE cost function has drawbacks such as incorrect saturation and a tendency to become trapped in local minima, resulting in slow convergence and poor performance [16]. MSE places more emphasis on reducing larger errors than smaller errors because of the squaring that takes place.…”
Section: Introduction (mentioning; confidence: 99%)
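A quick numerical check of that last point (the residual values below are arbitrary, chosen only to show the effect of squaring):

```python
# Squaring makes a large residual dominate both the loss and the gradient.
residual_small, residual_large = 0.1, 0.9
print(residual_small ** 2, residual_large ** 2)        # 0.01 vs 0.81
print((residual_large ** 2) / (residual_small ** 2))   # 81.0x loss contribution
# The gradient of 0.5*(o - t)^2 w.r.t. the output o is (o - t), so the
# large residual also drives a 9x larger weight update.
print(residual_large / residual_small)                 # 9.0
```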
“…However, there is a large difference between making a parameter chaotic and chaos injection. Among other approaches, noise injection (NI) into the network [10], the addition of a penalty term in the cost function [1], a change of activation function [11] and a change of cost function [12] are worth mentioning. However, these approaches are greedy and they are only loosely motivated by biological counterparts.…”
(mentioning; confidence: 99%)
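Of the alternatives listed in that citation, noise injection is the easiest to sketch. The snippet below is a generic illustration of adding input noise during training; the noise model, standard deviation, and function name are assumptions and are not taken from the cited reference [10].

```python
# Generic input noise injection for backpropagation training (illustrative).
import numpy as np

rng = np.random.default_rng(0)

def noisy_batch(inputs, noise_std=0.05):
    """Return a copy of the training inputs with zero-mean Gaussian noise
    added, a common regularization of BP training."""
    return inputs + rng.normal(0.0, noise_std, size=inputs.shape)

batch = np.array([[0.2, 0.8], [0.5, 0.1]])  # hypothetical training batch
print(noisy_batch(batch))
```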