2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) 2020
DOI: 10.1109/cvprw50498.2020.00125
What is Happening Inside a Continual Learning Model? A Representation-Based Evaluation of Representational Forgetting

Cited by 2 publications (4 citation statements). References 6 publications.
“…Similar to the commonly observed forgetting at the output level, forgetting at the feature level can also occur in continual learning, although it is more elusive to analyze [10]. Changes in the representation do not necessarily mean forgetting.…”
Section: Feature Forgetting
Confidence: 62%
“…Besides being a natural extension of a common practice in representation evaluation, this protocol has the following advantages: 1) Consistency: the choice of evaluation tasks is unambiguous and consistent, independent of the model and the algorithm, unlike in general representation evaluation and in [10]. This makes the metric P_rep well suited for comparing methods, even against non-continual-learning algorithms.…”
Section: A Protocol for Evaluating Representation in Continual Learning
Confidence: 99%
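The protocol quoted above evaluates a continually trained model by probing its frozen features on fixed evaluation tasks. As a minimal sketch of that general idea (the function name `linear_probe_accuracy`, the closed-form ridge-regression probe, and the toy data are assumptions for illustration, not the cited paper's exact P_rep computation):

```python
import numpy as np

def linear_probe_accuracy(feats_train, y_train, feats_test, y_test, l2=1e-3):
    """Fit a linear probe (ridge regression on one-hot targets) on frozen
    features and return test accuracy, a common proxy for representation
    quality in continual-learning evaluation."""
    n_classes = int(max(y_train.max(), y_test.max())) + 1
    Y = np.eye(n_classes)[y_train]  # one-hot targets, shape (n, n_classes)
    X = feats_train
    # Closed-form ridge solution: W = (X^T X + l2*I)^{-1} X^T Y
    W = np.linalg.solve(X.T @ X + l2 * np.eye(X.shape[1]), X.T @ Y)
    preds = (feats_test @ W).argmax(axis=1)
    return float((preds == y_test).mean())

# Toy demo: two well-separated 8-dimensional feature clusters standing in
# for the frozen features a continually trained backbone would produce.
rng = np.random.default_rng(0)
f0 = rng.normal(loc=-2.0, size=(50, 8))
f1 = rng.normal(loc=+2.0, size=(50, 8))
feats = np.vstack([f0, f1])
labels = np.array([0] * 50 + [1] * 50)
acc = linear_probe_accuracy(feats, labels, feats, labels)
```

Because the probe's training tasks are fixed in advance and independent of the continual-learning algorithm, such a score can be compared across methods, which is the consistency property the citation statement highlights.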