2024
DOI: 10.1109/tnnls.2023.3292359

Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix

Cited by 6 publications (2 citation statements)
References 28 publications
“…However, the regularization-based CL method constructs a regularization-based term as an approximate loss function of previous tasks to limit the updating of parameters, which may cause the network parameters to not reflect the current new tasks in a timely manner. Most of the methods are based on heuristics, with no good theoretical understanding of the factors associated with catastrophic forgetting [47].…”
Section: Continuous Learning Methods Based on Regularization (mentioning)
confidence: 99%
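As an illustration of the regularization-based approach the citing paper describes (a penalty term that approximates the loss of previous tasks and limits parameter updates), the sketch below shows an EWC-style quadratic penalty weighted by a diagonal curvature estimate, here the empirical Fisher diagonal as a common proxy for per-parameter Hessian curvature. This is a minimal sketch of the general technique, not the method proposed in the cited TNNLS paper; the function and variable names (`estimate_fisher_diag`, `ewc_penalty`, `old_params`, `lam`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def estimate_fisher_diag(model, data_loader, device="cpu"):
    """Empirical Fisher diagonal on a previous task's data: a cheap
    per-parameter importance (curvature) estimate."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    model.eval()
    for x, y in data_loader:
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    n_batches = max(len(data_loader), 1)
    return {n: f / n_batches for n, f in fisher.items()}

def ewc_penalty(model, old_params, fisher_diag, lam=100.0):
    """Quadratic penalty that discourages moving parameters that were
    important (high curvature) for previously learned tasks."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if n in fisher_diag:
            penalty = penalty + (fisher_diag[n] * (p - old_params[n]) ** 2).sum()
    return lam / 2.0 * penalty

# Training on a new task: total loss = new-task loss + regularization term
# loss = F.cross_entropy(model(x_new), y_new) + ewc_penalty(model, old_params, fisher_diag)
```

The trade-off the citation statement points at is visible here: a large `lam` keeps parameters close to their old values (less forgetting) but can prevent them from adapting to the new task in a timely manner.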
“…This means they may face two potential problems: catastrophic forgetting (losing previous knowledge and skills) or interference (degrading new knowledge and skills) [162]. The brain can prevent forgetting and interference through sleep, rehearsal, and consolidation [159], while continuous learning techniques use methods such as regularization, replay, and distillation to mitigate them [171].…”
(mentioning)
confidence: 99%