International Joint Conference on Neural Networks 1989
DOI: 10.1109/ijcnn.1989.118644

Backpropagation separates when perceptrons do

Abstract: We consider in this paper the behavior of the least squares problem that arises when one attempts to train a feedforward net with no hidden neurons. It is assumed that the net has monotonic nonlinear output units. Under the assumption that a training set is separable, that is, that there is a set of achievable outputs for which the error is zero, we show that there are no non-global minima. More precisely, we assume that the error is of a threshold-LMS type, in that the error function is zero for values 'beyond…
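The threshold-LMS cost described in the abstract can be sketched as follows; the function name, the 0/1 label encoding, and the margin eps are illustrative assumptions rather than notation from the paper.

```python
import numpy as np

def threshold_lms_error(outputs, labels, eps=0.1):
    """Sketch of a threshold-LMS ("clamped") squared error: the per-pattern
    error is zero whenever the output already lies "beyond" its target
    (above 1 - eps for class-1 patterns, below eps for class-2 patterns),
    and quadratic otherwise. Names and eps are illustrative assumptions.
    """
    outputs = np.asarray(outputs, dtype=float)
    labels = np.asarray(labels)
    errors = np.zeros_like(outputs)
    pos = labels == 1                      # class-1 patterns, target 1 - eps
    neg = ~pos                             # class-2 patterns, target eps
    errors[pos] = np.maximum(0.0, (1.0 - eps) - outputs[pos]) ** 2
    errors[neg] = np.maximum(0.0, outputs[neg] - eps) ** 2
    return 0.5 * errors.sum()
```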

Cited by 21 publications (6 citation statements)
References 3 publications

“…If the target for a pattern of class 1 is 1 − ε but the output is greater than 1 − ε, then the output is clamped to 1 − ε. Similarly, for a pattern of class 2, if the target is ε but the network output is less than ε, then the output is clamped to ε. The clamp is used to implement the modified penalty function suggested by Sontag and Sussmann [11]. They observe that backpropagation is less likely to get stuck in local minima when the output is clamped during training.…”
Section: Definitions (mentioning)
confidence: 99%
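A minimal sketch of the clamping rule quoted above, assuming 0/1 class labels and a margin eps; variable names are illustrative. Once outputs are clamped, the ordinary squared-error cost and its gradient vanish for patterns that already lie beyond their targets, which matches the threshold-LMS cost sketched after the abstract.

```python
import numpy as np

def clamp_outputs(outputs, labels, eps=0.1):
    """Pull outputs that are already "beyond" their targets back to the
    target value, so they contribute neither error nor gradient when the
    usual LMS cost is applied afterwards. Illustrative sketch only.
    """
    clamped = np.asarray(outputs, dtype=float).copy()
    class1 = np.asarray(labels) == 1
    class2 = ~class1
    # Class 1 (target 1 - eps): clamp outputs above 1 - eps down to 1 - eps.
    clamped[class1] = np.minimum(clamped[class1], 1.0 - eps)
    # Class 2 (target eps): clamp outputs below eps up to eps.
    clamped[class2] = np.maximum(clamped[class2], eps)
    return clamped
```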
“…This threshold-LMS error has been introduced by Sontag & Sussmann (1989). This cost does not penalize outputs "beyond" the target values.…”
Section: -(O(t(q))-d-) (mentioning)
confidence: 99%
“…The target values were 0.9 for grammatical and 0.1 for ungrammatical strings. An LMS-threshold function was used for the error computation (Sontag & Sussmann, 1989).…”
Section: Experimental Set-Up (mentioning)
confidence: 99%
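As a hypothetical illustration of that setup, targets of 0.9 and 0.1 correspond to a margin of eps = 0.1 in the clamped cost; the output values below are made up for the example.

```python
import numpy as np

eps = 0.1
outputs = np.array([0.95, 0.40, 0.05, 0.30])        # hypothetical network outputs
grammatical = np.array([True, True, False, False])  # string labels

# Error is zero for outputs already beyond their targets
# (above 0.9 for grammatical, below 0.1 for ungrammatical strings).
err = np.where(grammatical,
               np.maximum(0.0, (1.0 - eps) - outputs),
               np.maximum(0.0, outputs - eps))
loss = 0.5 * np.sum(err ** 2)
```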
“…We remark that an error function similar to (5) has been considered by Sontag and Sussmann in [10], where, with reference to multilayer perceptron networks, the properties of its local minima have been analyzed. However, as far as we are aware, the use of such an error function for addressing the problem of network training has not yet been proposed.…”
Section: An Alternative Formulation of the Training Problem for P… (mentioning)
confidence: 99%