ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and Its Applications (IEEE Cat. No.99EX359)
DOI: 10.1109/isspa.1999.818191
An analysis of the exponentiated gradient descent algorithm

Cited by 5 publications (2 citation statements)
References 6 publications
“…The key challenge of CIL is that discarding old data is especially catastrophic for deep-learning models, as they are data-driven and end-to-end representation learning in nature. The network parameters tuned on the old data are overridden by the SGD optimizer using new data [11,33], causing a drastic performance drop, i.e., catastrophic forgetting [36,10,28], on the old classes.…”
Section: Introduction (mentioning)
Confidence: 99%
“…The key challenge of CIL is that discarding old data is especially catastrophic for deep-learning models, as they are data-driven and end-to-end representation learning in nature. The network parameters tuned on the old data are overridden by the SGD optimizer using new data [138,139],…”
Section: Introduction (mentioning)
Confidence: 99%
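
Both citation statements describe the same mechanism: once the old-class data is discarded, SGD updates driven only by new-class data overwrite the parameters that encoded the old classes. The following is a minimal, purely illustrative sketch of that effect on a toy two-task setup with synthetic data; the model, data, and hyperparameters are assumptions for illustration and are not taken from the cited papers.

```python
# Toy illustration of catastrophic forgetting: train a shared linear head on
# "old" classes, then continue SGD training on "new" classes only (old data
# discarded), and measure the accuracy drop on the old classes.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_blobs(centers, n=200):
    """Synthetic 2-D Gaussian blobs, one per class center."""
    xs, ys = [], []
    for label, c in enumerate(centers):
        xs.append(torch.randn(n, 2) * 0.3 + torch.tensor(c))
        ys.append(torch.full((n,), label))
    return torch.cat(xs), torch.cat(ys)

# Old task: classes 0 and 1; new task: classes 2 and 3.
x_old, y_old = make_blobs([(-2.0, 0.0), (2.0, 0.0)])
x_new, y_new = make_blobs([(0.0, 2.0), (0.0, -2.0)])
y_new = y_new + 2  # shift labels to 2 and 3

model = nn.Linear(2, 4)  # one head covering all four classes
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def fit(x, y, steps=300):
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

fit(x_old, y_old)
print("old-class accuracy after old task:", accuracy(x_old, y_old))

fit(x_new, y_new)  # old data discarded: SGD sees only new classes
print("old-class accuracy after new task:", accuracy(x_old, y_old))
print("new-class accuracy after new task:", accuracy(x_new, y_new))
```

Run as written, this typically shows near-perfect old-class accuracy after the first stage and a collapse after the second, because the SGD updates computed from the new-class loss alone push the shared head's old-class logits down, which is the parameter-overriding behaviour the quoted statements refer to.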