2022 · DOI: 10.3390/e24091303
An Asymmetric Contrastive Loss for Handling Imbalanced Datasets

Abstract: Contrastive learning is a representation learning method that contrasts a sample against similar samples so that they are drawn close together, forming clusters in the feature space. The learning process is typically conducted with a two-stage training architecture and uses the contrastive loss (CL) for feature learning. Contrastive learning has been shown to be quite successful in handling imbalanced datasets, in which some classes are overrepresented while others are underrepresented.
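The pull-together/push-apart behavior the abstract describes can be sketched with a classic pairwise contrastive loss. This is a generic illustration, not the asymmetric loss proposed in the paper: the function name, margin value, and toy embeddings below are all hypothetical.

```python
import numpy as np

def contrastive_pair_loss(z_i, z_j, same_class, margin=1.0):
    """Generic pairwise contrastive loss: attract embeddings of
    similar samples, repel dissimilar ones until they are at
    least `margin` apart (beyond that, no penalty)."""
    d = np.linalg.norm(z_i - z_j)
    if same_class:
        return 0.5 * d ** 2                      # attract: penalize any distance
    return 0.5 * max(0.0, margin - d) ** 2       # repel: penalize closeness only

# Toy 2-D embeddings
a = np.array([0.0, 0.0])
b = np.array([0.1, 0.0])   # near a, same class -> small attractive loss
c = np.array([2.0, 0.0])   # far from a, other class -> already past margin

print(contrastive_pair_loss(a, b, same_class=True))    # ≈ 0.005
print(contrastive_pair_loss(a, c, same_class=False))   # 0.0 (farther than margin)
```

Minimizing this over many pairs is what produces the per-class clusters in feature space; the paper's contribution is making the loss asymmetric so minority classes are not swamped by the majority ones.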

Cited by 3 publications (1 citation statement)
References 23 publications
“…CNN is computationally efficient in that the features in one area of the image are often the same as those in another part of the image, which allows the same weights to be used to calculate activations on other regions of the image. Thus, the number of parameters, weights, and links to be trained is reduced [39].…”
Section: Methods
confidence: 99%
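The parameter saving from weight sharing is easy to quantify. The sketch below compares a convolutional layer (one small filter reused at every location) with a fully connected layer producing the same output size; the input size and channel counts are hypothetical, chosen only for the comparison.

```python
def conv_params(in_ch, out_ch, k):
    # One shared k x k x in_ch filter per output channel, plus a bias each;
    # the same filter slides over every spatial position.
    return out_ch * (k * k * in_ch + 1)

def dense_params(in_units, out_units):
    # A dense layer needs a separate weight per input-output pair, plus biases.
    return out_units * (in_units + 1)

h = w = 32                                  # hypothetical 32x32 RGB input
conv = conv_params(3, 16, 3)                # 16 shared 3x3 filters
dense = dense_params(h * w * 3, 16 * h * w) # same output volume, no sharing

print(conv)    # 448
print(dense)   # 50348032
```

The convolutional layer needs 448 trainable parameters versus roughly 50 million for the dense equivalent, which is exactly the reduction in "parameters, weights, and links" the citing paper describes.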