2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.00640
Towards Backward-Compatible Representation Learning

Cited by 73 publications (123 citation statements)
References 31 publications
“…Backward-compatible representation learning was first introduced in Shen et al (2020) for learning inter-operable visual embeddings for image retrieval tasks. Later, Yan et al (2020) formalized the model update regression problem in machine learning and explored solutions on image classification tasks.…”
Section: Model Update Regression and Solutions
confidence: 99%
“…The model regression issue in deep learning was first examined in Shen et al (2020), who investigate compatible representation learning for image retrieval. Yan et al (2020) proposed positive-congruent training (PCT) for image classification, which minimizes prediction errors and model regression at the same time.…”
Section: Introduction
confidence: 99%
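The PCT idea described above can be sketched as a standard classification loss plus a distillation term restricted to samples the old model classified correctly, so the new model is discouraged from flipping those samples to errors. This is a minimal illustrative sketch in NumPy; the function names, the KL-based distillation term, and the weighting `lam` are assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)      # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def pct_loss(new_logits, old_logits, labels, lam=1.0):
    """PCT-style sketch: cross-entropy plus a distillation term applied
    only where the OLD model was correct (positive-congruent subset)."""
    p_new = softmax(new_logits)
    n = len(labels)
    ce = -np.log(p_new[np.arange(n), labels] + 1e-12).mean()
    old_correct = old_logits.argmax(1) == labels      # mask: old model was right
    p_old = softmax(old_logits)
    # KL(old || new), a common distillation choice, on the old-correct subset
    kl = (p_old * (np.log(p_old + 1e-12) - np.log(p_new + 1e-12))).sum(1)
    distill = kl[old_correct].mean() if old_correct.any() else 0.0
    return ce + lam * distill
```

When the new and old models agree, the distillation term vanishes and the loss reduces to plain cross-entropy.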
“…Unlike in other tasks, the parameters of the last classifier of a face recognition model are crucial for recognition performance but strongly tied to privacy. These parameters can be regarded as mean embeddings of the identities (Wang et al, 2018; , 2021b; Shen et al, 2020) (also called class centers), from which individual privacy could be inferred, as studied by numerous works (Kumar Jindal et al, 2018; Boddeti, 2018; Mai et al, 2020; Dusmanu et al, 2021). This prevents the FL approach from broadcasting the whole model between clients and the central server, and consequently leads to conflicts in the aggregation of local updates.…”
Section: Introduction
confidence: 99%
“…To harvest the rewards of the new model immediately, Shen et al (2020) introduce compatible representation learning, which trains the new model under backward-compatibility constraints so that queries encoded by the new model can be directly indexed against the old gallery features. Meanwhile, because the new and old features are interchangeable, the gallery images can be backfilled on the fly, and retrieval performance gradually improves toward the optimal accuracy of the new model, dubbed hot-refresh model upgrades (see Figure 1 (a)).…”
Section: Introduction
confidence: 99%
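The hot-refresh setup in the statement above can be sketched as cosine-similarity retrieval against a mixed gallery: queries come from the new model, while the gallery starts as old-model features and is backfilled item by item. The toy "compatible" features below (old features plus small noise) and all names are illustrative assumptions, not the paper's training procedure.

```python
import numpy as np

def normalize(x):
    # L2-normalize along the last axis so dot products are cosine similarities
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def retrieve(query_feat, gallery_feats):
    """Rank gallery indices by cosine similarity to the query."""
    sims = normalize(gallery_feats) @ normalize(query_feat)
    return np.argsort(-sims)

np.random.seed(0)
old_gallery = normalize(np.random.randn(5, 8))
# Toy stand-in for a compatibly trained new model: features stay close
# to the old embedding space.
new_gallery = normalize(old_gallery + 0.05 * np.random.randn(5, 8))

gallery = old_gallery.copy()
query = new_gallery[2]                 # new-model query for gallery item 2
for i in range(len(gallery)):
    gallery[i] = new_gallery[i]        # backfill one item at a time
    top1 = retrieve(query, gallery)[0]  # retrieval keeps working mid-upgrade
```

Because old and new features are interchangeable here, the new-model query matches its gallery item at every intermediate backfill step, which is exactly the property hot-refresh upgrades rely on.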
“…Meanwhile, because the new and old features are interchangeable, the gallery images can be backfilled on the fly, and retrieval performance gradually improves toward the optimal accuracy of the new model, dubbed hot-refresh model upgrades (see Figure 1 (a)). Although existing compatible training methods (Meng et al, 2021; Shen et al, 2020) make it possible to upgrade the model in a hot-refresh manner, they still face the challenge of model regression (Yan et al, 2020; Li & Hoiem, 2018), which is caused by negative flips, i.e., queries correctly indexed by the old model but incorrectly recognized by the new one (see Figure 1 (b)). We claim that, during hot-refresh model upgrades, negative flips occur when the new-to-new similarities between negative query-gallery pairs exceed the new-to-old similarities between compatible positive query-gallery pairs.…”
Section: Introduction
confidence: 99%
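The negative-flip condition stated above can be expressed directly: compare the new-to-new similarities against negative gallery items with the new-to-old similarity to the compatible positive item. A minimal sketch, with illustrative function and variable names that are assumptions rather than the cited paper's notation:

```python
import numpy as np

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def is_negative_flip(q_new, pos_gallery_old, neg_gallery_new):
    """Per the condition above: a negative flip occurs when some new-to-new
    similarity to a NEGATIVE gallery item exceeds the new-to-old similarity
    to the compatible POSITIVE gallery item."""
    q = normalize(q_new)
    pos_sim = float(q @ normalize(pos_gallery_old))   # new-to-old, positive pair
    neg_sims = normalize(neg_gallery_new) @ q         # new-to-new, negative pairs
    return bool(neg_sims.max() > pos_sim)
```

In this formulation a flip can appear mid-upgrade even when both models are individually accurate, because the comparison mixes features from two embedding spaces.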