2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00516
Multi-Similarity Loss With General Pair Weighting for Deep Metric Learning

Abstract: A family of loss functions built on pair-based computation have been proposed in the literature which provide a myriad of solutions for deep metric learning. In this paper, we provide a general weighting framework for understanding recent pair-based loss functions. Our contributions are three-fold: (1) we establish a General Pair Weighting (GPW) framework, which casts the sampling problem of deep metric learning into a unified view of pair weighting through gradient analysis, providing a powerful tool for unde…
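The gradient-analysis view the abstract refers to can be sketched as follows (the notation here is assumed for illustration, not taken verbatim from the paper): for a pair-based loss $\mathcal{L}$ defined on the pairwise similarity matrix $S$ of a mini-batch, the chain rule gives

```latex
\frac{\partial \mathcal{L}}{\partial \theta}
  = \sum_{i,j} \frac{\partial \mathcal{L}}{\partial S_{ij}}\,
               \frac{\partial S_{ij}}{\partial \theta},
\qquad
w_{ij} := \Bigl|\,\frac{\partial \mathcal{L}}{\partial S_{ij}}\Bigr|,
```

so one training step amounts to descending a weighted sum of pair similarities, and each pair-based loss can be characterized by the weights $w_{ij}$ it implicitly assigns to positive and negative pairs.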


Cited by 696 publications (632 citation statements); References 30 publications.
“…The advantage of N-pair loss will be lost in this situation. Unlike all other loss metrics in the literature, the authors in [102] benefited from multi-similarity loss, using self-similarity, negative relative similarity, and positive relative similarity under the general pair weighting framework to capture the similarity between samples. This loss takes both self-similarity and relative similarities into account, which makes it possible for the model to gather and weight informative pair samples more efficiently.…”
Section: Loss Functions for Deep Metric Learning
confidence: 99%
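The interplay of self-similarity (how close a pair is to a fixed margin) and relative similarity (how a pair compares to the other pairs of its anchor) can be sketched in numpy as below. This is a minimal sketch of the multi-similarity loss, assuming cosine similarity and illustrative hyperparameter values (`alpha`, `beta`, `lam`, and the mining margin `eps` are not the paper's tuned settings):

```python
import numpy as np

def multi_similarity_loss(embeddings, labels, alpha=2.0, beta=50.0,
                          lam=0.5, eps=0.1):
    """Multi-similarity loss over a batch (numpy sketch, assumed defaults)."""
    # Cosine similarity between every pair in the batch.
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = x @ x.T
    n = len(labels)
    losses = []
    for i in range(n):
        pos = sim[i][(labels == labels[i]) & (np.arange(n) != i)]
        neg = sim[i][labels != labels[i]]
        if pos.size == 0 or neg.size == 0:
            continue
        # Mining by relative similarity: keep only informative pairs,
        # i.e. positives not clearly above all negatives, and vice versa.
        pos_kept = pos[pos - eps < neg.max()]
        neg_kept = neg[neg + eps > pos.min()]
        # Soft weighting of the surviving pairs via their self-similarity.
        pos_term = np.log(1.0 + np.sum(np.exp(-alpha * (pos_kept - lam)))) / alpha
        neg_term = np.log(1.0 + np.sum(np.exp(beta * (neg_kept - lam)))) / beta
        losses.append(pos_term + neg_term)
    return float(np.mean(losses)) if losses else 0.0
```

A batch whose classes are already well separated yields a near-zero loss (the mining step discards all easy pairs), while a batch with mixed-up classes keeps its hard pairs and is penalized.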
“…Only three samples in a batch participate in the training. N-pair loss [55] increases the number of negative samples that interact with the query sample to improve the performance of triplet loss. It takes advantage of all sample pairs in the mini-batch and learns more differentiated representations based on structural information between the data.…”
Section: N-pair Loss
confidence: 99%
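The contrast with triplet loss is that one anchor is pushed away from many negatives at once. A minimal sketch of the (N+1)-tuplet formulation for a single anchor, assuming dot-product similarity (function and argument names are illustrative):

```python
import numpy as np

def n_pair_loss(anchor, positive, negatives):
    """N-pair loss for one anchor (numpy sketch).

    anchor and positive are 1-D embeddings; negatives holds one embedding
    per row, e.g. the positives of the other N-1 classes in the mini-batch.
    """
    pos_sim = float(anchor @ positive)   # similarity to the positive
    neg_sims = negatives @ anchor        # similarities to every negative
    # Softmax-style formulation: the loss is small only when pos_sim
    # exceeds every neg_sim, so all negatives contribute gradient at once.
    return float(np.log(1.0 + np.sum(np.exp(neg_sims - pos_sim))))
```

With a single negative this reduces to a softplus-smoothed triplet loss; adding rows to `negatives` is what recovers the improved interaction the quoted statement describes.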
“…DML has been a long-standing research hotspot for improving the performance of image retrieval [42][43][44][45][46][52]. There are two different research directions in DML: clustering-based and pair-based structured losses.…”
Section: Deep Metric Learning
confidence: 99%
“…As a mass of structured losses [41][42][43][44][45][46][47] have obtained appreciable effectiveness in training networks to learn discriminative embedding features, we provide a brief review of the development of pair-based structured losses.…”
Section: Pair-Based Structured Loss
confidence: 99%