2020
DOI: 10.48550/arxiv.2011.00362
Preprint

A Survey on Contrastive Self-supervised Learning

Abstract: Self-supervised learning has gained popularity because of its ability to avoid the cost of annotating large-scale datasets. It is capable of adopting self-defined pseudo labels as supervision and using the learned representations for several downstream tasks. Specifically, contrastive learning has recently become a dominant component in self-supervised learning methods for computer vision, natural language processing (NLP), and other domains. It aims at embedding augmented versions of the same sample close to ea…

Cited by 17 publications (18 citation statements)
References 25 publications
“…It has recently been reported that the performance of a model can be improved by training multiple tasks in the field of self-supervised learning [27]. As the task completion time is an additional task that can be provided for almost any task and it can be treated as a subtask in self-supervised learning, self-supervised learning can be achieved in almost any task using the proposed framework.…”
Section: B Examination Of Task Completion Time Repeatability For Comp...
confidence: 99%
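The auxiliary-task idea in the excerpt above can be illustrated with a short sketch: a shared encoder is trained on its main objective together with a head that predicts task completion time as a self-supervised subtask. All module names, dimensions, and the weighting factor below are assumptions for illustration, not details of the cited framework.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskModel(nn.Module):
    """Shared backbone with a main classification head and an auxiliary
    completion-time regression head (hypothetical names)."""
    def __init__(self, encoder, feat_dim, num_classes):
        super().__init__()
        self.encoder = encoder                      # shared backbone
        self.main_head = nn.Linear(feat_dim, num_classes)
        self.time_head = nn.Linear(feat_dim, 1)     # auxiliary: completion time

    def forward(self, x):
        h = self.encoder(x)
        return self.main_head(h), self.time_head(h).squeeze(-1)

def multitask_loss(main_logits, pred_time, labels, completion_time, aux_weight=0.1):
    # Main supervised objective plus the self-supervised completion-time subtask.
    main = F.cross_entropy(main_logits, labels)
    aux = F.mse_loss(pred_time, completion_time)
    return main + aux_weight * aux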
“…Extending from TL, self-supervised learning (SSL) [27,28] is a technique where the pre-training stage uses only unlabeled data. Specifically, "pretext tasks" like image rotation prediction [29] and jigsaw puzzle solving [30] are invented for the data to provide its own supervision.…”
Section: Introduction
confidence: 99%
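The rotation-prediction pretext task mentioned in the excerpt above can be sketched as follows: each image is rotated by one of four fixed angles and the network learns to classify which rotation was applied, so the data provides its own labels. The function and variable names, and the assumption that the encoder outputs 4-way logits, are illustrative rather than details of the cited works.

import torch
import torch.nn.functional as F

def rotation_pretext_batch(images):
    """images: (N, C, H, W). Returns rotated images and rotation labels in {0,1,2,3}."""
    labels = torch.randint(0, 4, (images.size(0),))
    rotated = torch.stack(
        [torch.rot90(img, k=int(k), dims=(1, 2)) for img, k in zip(images, labels)]
    )
    return rotated, labels

# Hypothetical training step, assuming `encoder(x)` returns 4-way rotation logits:
# rotated, labels = rotation_pretext_batch(batch)
# loss = F.cross_entropy(encoder(rotated), labels)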
“…Specifically, "pretext tasks" like image rotation prediction [29] and jigsaw puzzle solving [30] are invented for the data to provide its own supervision. In particular, contrastive SSL [28] (or contrastive learning) is an increasingly popular technique where the pretext task is constructed as contrasting between two variations of a sample and other samples, where variations are derived using image transformations. The goal is for the pre-trained model to output embeddings where similar (differing) instances are closer (further) in the embedding metric space; this has proven to be highly effective for CV downstream tasks like classification, segmentation and detection.…”
Section: Introduction
confidence: 99%
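The contrastive objective described above, in which two augmented views of the same sample are pulled together in the embedding space while other samples are pushed apart, is commonly instantiated as an NT-Xent / InfoNCE-style loss. The sketch below assumes two embedding batches for the same samples and a temperature hyperparameter; the names and values are illustrative, not the survey's exact formulation.

import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """z1, z2: (N, D) embeddings of two augmented views of the same N samples."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                    # (2N, D)
    sim = torch.mm(z, z.t()) / temperature            # pairwise cosine similarities
    n = z1.size(0)
    # Mask out self-similarity so a sample is never its own negative.
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))
    # For row i, the positive is the other augmented view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)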
“…Recently, self-supervised methods, especially contrastive learning, have played a vital role in the progress of deep learning because the data provides its own supervision. Besides, they can also alleviate generalization error, spurious correlations, and adversarial attacks [Jaiswal et al, 2021]. This kind of technology is applied in many fields.…”
Section: Introduction
confidence: 99%