2021
DOI: 10.48550/arxiv.2103.16940
Preprint
Learning with Memory-based Virtual Classes for Deep Metric Learning

Abstract: The core of deep metric learning (DML) involves learning visual similarities in high-dimensional embedding space. One of the main challenges is to generalize from seen classes of training data to unseen classes of test data. Recent works have focused on exploiting past embeddings to increase the number of instances for the seen classes. Such methods achieve performance improvement via augmentation, while the strong focus on seen classes still remains. This can be undesirable for DML, where training and test da…
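The abstract describes the general technique of storing past embeddings to enlarge the pool of instances per seen class. A minimal sketch of such an embedding memory is below; the class name, queue size, and pairing scheme are illustrative assumptions, not the paper's exact mechanism.

```python
from collections import deque

import numpy as np


class EmbeddingMemory:
    """FIFO memory of past (embedding, label) pairs, a common way to
    enlarge the set of instances available per seen class.
    Illustrative sketch only, not the paper's specific method."""

    def __init__(self, size: int):
        self.embeddings = deque(maxlen=size)
        self.labels = deque(maxlen=size)

    def enqueue(self, batch_embeddings, batch_labels):
        # Store copies of the current batch for reuse in later steps.
        for e, y in zip(batch_embeddings, batch_labels):
            self.embeddings.append(np.asarray(e, dtype=np.float64))
            self.labels.append(y)

    def all_pairs(self, batch_embeddings, batch_labels):
        # Pair every current embedding with every stored one, yielding
        # far more positive/negative pairs than a single batch provides.
        pairs = []
        for e, y in zip(batch_embeddings, batch_labels):
            for m_e, m_y in zip(self.embeddings, self.labels):
                pairs.append((e, m_e, y == m_y))
        return pairs


memory = EmbeddingMemory(size=4)
memory.enqueue([[1.0, 0.0], [0.0, 1.0]], [0, 1])
pairs = memory.all_pairs([[1.0, 0.0]], [0])
print(len(pairs))  # 1 current embedding x 2 stored = 2 candidate pairs
```

A DML loss (e.g. contrastive or triplet) would then be computed over these memory-augmented pairs; the paper's contribution is to go beyond this seen-class augmentation via memory-based virtual classes.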

Cited by 2 publications (1 citation statement)
References 10 publications
“…(Ye et al 2019;Li et al 2020a) apply it in unsupervised tasks to learn prototypes as non-parametric classifiers for all input instances. (Qiao et al 2021;Yang et al 2021;Ko, Gu, and Kim 2021) employ it in the few-shot tasks to maintain prototypes as centers for tail categories, preventing them from being overwhelmed by head categories. (Joseph et al 2021) uses prototypes to acquire discriminative features to distinguish unknown objects in the open world.…”
Section: Introduction
Mentioning confidence: 99%