2023
DOI: 10.3233/faia230584

Identical and Fraternal Twins: Fine-Grained Semantic Contrastive Learning of Sentence Representations

Qingfa Xiao,
Shuangyin Li,
Lei Chen

Abstract: Unsupervised learning of sentence representations has been significantly enhanced through contrastive learning, which clusters an augmented positive instance with its anchor instance to shape the desired embedding space. However, relying solely on the contrastive objective can yield sub-optimal results, as it cannot differentiate subtle semantic variations between positive pairs. Specifically, common data augmentation techniques frequently introduce semantic …

Cited by 1 publication (1 citation statement)
References 30 publications
“…Employing one prompt for all disciplines may result in a uniform semantic shift and a convergence of passage representations in a restricted region of the embedding space, as depicted in Figure 1. This semantic space collapse (Gao et al., 2019; Xiao et al., 2023) can blur the distinction between relevant and irrelevant passages, potentially masking irrelevant passages amidst relevant ones. Therefore, prompt generation is pivotal for this semantically nuanced task.…”
Section: Introduction
Mentioning confidence: 99%