2020
DOI: 10.1007/978-3-030-64556-4_26

Offline Versus Online Triplet Mining Based on Extreme Distances of Histopathology Patches

Cited by 15 publications (9 citation statements)
References 19 publications
“…Recently, the contrastive learning mechanism has demonstrated promising performance in self-supervised action recognition. The supervision signals of the contrastive learning paradigm are usually generated by a contrastive loss [19], such as InfoNCE [22], SimCLR [40], MoCo [21], and OTM [23]. These contrastive losses have been widely used in recent self-supervised action recognition methods.…”
Section: Related Work, A. Self-supervised Skeleton-based Action Recognition (mentioning)
confidence: 99%
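
The losses named in the excerpt above (InfoNCE, SimCLR, MoCo, OTM) all contrast a positive pair against in-batch negatives. For context only, the following is a minimal sketch of an InfoNCE-style loss in PyTorch; the function name, temperature value, and tensor shapes are illustrative assumptions, not code from the cited works.

# Minimal sketch (illustrative, not the cited papers' code) of an
# InfoNCE-style contrastive loss over two augmented views per sample.
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (N, D) embeddings of two augmented views of the same N samples."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                  # (2N, D)
    sim = z @ z.t() / temperature                   # scaled cosine similarities
    n = z1.size(0)
    sim.fill_diagonal_(float('-inf'))               # exclude self-similarity
    # For row i, the positive is the other view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Example with random embeddings standing in for an encoder's outputs.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(info_nce_loss(z1, z2).item())
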
“…AimCLR [15] adopts InfoNCE [22] as a contrastive loss to improve the model performance on downstream tasks. In this paper, we use OTM [23] to implement our DMMG.…”
Section: Related Work, A. Self-supervised Skeleton-based Action Recognition (mentioning)
confidence: 99%
“…While the current batch is processed by the training framework on the GPU, we generate the next one on the CPU. This part could be improved with various triplet generation techniques such as hard mining [22,23], but as this is not the topic of the current paper we use the simplest random selection.…”
Section: Training, 2.2.1 Batch Generation (mentioning)
confidence: 99%
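
The "simplest random selection" mentioned in the excerpt can be read as drawing triplets from class labels alone, without consulting the current embeddings. The sketch below assumes a plain list of labels; the function name and sampling scheme are illustrative, not the cited paper's implementation.

# Minimal sketch of label-only random triplet selection (anchor, positive, negative).
import random
from collections import defaultdict

def random_triplets(labels, num_triplets):
    """labels: list of class ids, one per sample index. Returns (a, p, n) index triplets."""
    by_class = defaultdict(list)
    for idx, lab in enumerate(labels):
        by_class[lab].append(idx)
    classes = [c for c, idxs in by_class.items() if len(idxs) >= 2]
    triplets = []
    for _ in range(num_triplets):
        pos_class = random.choice(classes)
        neg_class = random.choice([c for c in by_class if c != pos_class])
        anchor, positive = random.sample(by_class[pos_class], 2)
        negative = random.choice(by_class[neg_class])
        triplets.append((anchor, positive, negative))
    return triplets

# Example: ten samples from three classes.
print(random_triplets([0, 0, 0, 1, 1, 1, 2, 2, 2, 2], 4))
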
“…showed [9] that triplet mining is useful and can be performed efficiently. Online triplet mining, where triplets are selected right before the training step, performs remarkably better than its counterpart (offline mining), where embeddings are unaffected by the most recent steps [14].…”
Section: Introduction (mentioning)
confidence: 99%
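
The contrast drawn in the excerpt is between offline mining, where triplets are fixed independently of the current training step, and online mining, where triplets are formed from the embeddings the model just produced for the batch. A common online variant is batch-hard mining; the sketch below is an illustrative assumption in PyTorch, not the cited papers' exact method.

# Minimal sketch of online "batch-hard" triplet mining: for each anchor,
# take the farthest in-batch positive and the closest in-batch negative
# using the embeddings from the current step.
import torch

def batch_hard_triplet_loss(embeddings: torch.Tensor,
                            labels: torch.Tensor,
                            margin: float = 0.2) -> torch.Tensor:
    """embeddings: (N, D); labels: (N,). Returns the batch-hard triplet loss."""
    dist = torch.cdist(embeddings, embeddings)            # (N, N) pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)     # positive mask (incl. self)
    eye = torch.eye(len(labels), dtype=torch.bool, device=embeddings.device)
    # Hardest positive: farthest sample with the same label (excluding self).
    pos_dist = dist.masked_fill(~same | eye, float('-inf')).max(dim=1).values
    # Hardest negative: closest sample with a different label.
    neg_dist = dist.masked_fill(same, float('inf')).min(dim=1).values
    return torch.relu(pos_dist - neg_dist + margin).mean()

# Example: embeddings are recomputed every step, so mining reflects the latest model.
emb = torch.randn(16, 64)
labels = torch.randint(0, 4, (16,))
print(batch_hard_triplet_loss(emb, labels).item())
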