2022
DOI: 10.48550/arxiv.2206.02574
Preprint

On the duality between contrastive and non-contrastive self-supervised learning

Quentin Garrido,
Yubei Chen,
Adrien Bardes
et al.

Abstract: Recent approaches in self-supervised learning of image representations can be categorized into different families of methods and, in particular, can be divided into contrastive and non-contrastive approaches. While differences between the two families have been thoroughly discussed to motivate new approaches, we focus more on the theoretical similarities between them. By designing contrastive and non-contrastive criteria that can be related algebraically and shown to be equivalent under limited assumptions, we…
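To make the contrast between the two families concrete, here is a minimal NumPy sketch of a representative loss from each: an InfoNCE-style contrastive loss (which relies on negative pairs within the batch) and a VICReg-style non-contrastive loss (invariance plus variance and covariance regularization, no negatives). This is an illustration of the two families in general, not the specific criteria constructed in the paper; the weights and the 0.1 temperature are arbitrary choices for the sketch.

```python
import numpy as np

def contrastive_loss(za, zb, temperature=0.1):
    """InfoNCE-style: pull matched pairs (i, i) together, push all other pairs apart."""
    za = za / np.linalg.norm(za, axis=1, keepdims=True)
    zb = zb / np.linalg.norm(zb, axis=1, keepdims=True)
    logits = za @ zb.T / temperature                 # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))              # cross-entropy against the diagonal

def non_contrastive_loss(za, zb, var_weight=1.0, cov_weight=1.0):
    """VICReg-style: match the two views, keep each dimension spread out and decorrelated."""
    n, d = za.shape
    invariance = np.mean(np.sum((za - zb) ** 2, axis=1))
    reg = 0.0
    for z in (za, zb):
        zc = z - z.mean(axis=0)
        std = np.sqrt(zc.var(axis=0) + 1e-4)
        reg += var_weight * np.mean(np.maximum(0.0, 1.0 - std))  # variance hinge per dim
        cov = (zc.T @ zc) / (n - 1)
        off_diag = cov - np.diag(np.diag(cov))
        reg += cov_weight * np.sum(off_diag ** 2) / d            # penalize correlations
    return invariance + reg

rng = np.random.default_rng(0)
za = rng.normal(size=(8, 4))
zb = za + 0.01 * rng.normal(size=(8, 4))   # two "views" of the same batch of 8 samples
print(contrastive_loss(za, zb), non_contrastive_loss(za, zb))
```

Note that the contrastive loss compares samples against each other, while the non-contrastive loss only regularizes the statistics of the embedding dimensions; the paper's contribution is to relate such sample-wise and dimension-wise criteria algebraically.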


Cited by 10 publications (15 citation statements)
References 22 publications
“…Various theoretical studies have also investigated non-contrastive methods for self-supervised learning [71,61,18,5,66,49,57,34]. Garrido et al [18] establishes the duality between contrastive and non-contrastive methods.…”
Section: Related Work
confidence: 99%
“…Various theoretical studies have also investigated non-contrastive methods for self-supervised learning [71,61,18,5,66,49,57,34]. Garrido et al [18] establishes the duality between contrastive and non-contrastive methods. Balestriero & LeCun [5] reveals the connections between variants of SimCLR, Barlow Twins, and VICReg to ISOMAP, Canonical Correlation Analysis, and Laplacian Eigenmaps, respectively.…”
Section: Related Work
confidence: 99%
“…A thorough discussion is beyond the scope of this work. We refer the curious readers to Garrido et al [2022] for a more general discussion on the duality between contrastive learning and non-contrastive learning.…”
Section: A Appendix
confidence: 99%
“…However, it is important to specify a pre-training objective function that induces good performance for the downstream tasks. Contrastive SSL methods, despite their early success, rely heavily on negative samples, extensive data augmentation, and large batch sizes (Jing et al, 2022;Garrido et al, 2022). Non-contrastive methods address these shortcomings, incorporating information theoretic principles through architectural innovations or regularization methods.…”
Section: Introduction
confidence: 99%