Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022
DOI: 10.18653/v1/2022.naacl-main.284

EASE: Entity-Aware Contrastive Learning of Sentence Embedding

Abstract: We present EASE, a novel method for learning sentence embeddings via contrastive learning between sentences and their related entities. The advantage of using entity supervision is twofold: (1) entities have been shown to be a strong indicator of text semantics and thus should provide rich training signals for sentence embeddings; (2) entities are defined independently of languages and thus offer useful cross-lingual alignment supervision. We evaluate EASE against other unsupervised models both in monolingual …
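The abstract's core objective, contrastive learning between a sentence and its related entity, can be illustrated with a short sketch. This is a minimal reading of that idea, assuming precomputed embeddings and in-batch negatives; the paper additionally uses hard-negative entities and combines this loss with a SimCSE-style dropout loss, both omitted here.

```python
# Minimal sketch of an entity-aware contrastive (InfoNCE) loss, assuming
# precomputed sentence and entity embeddings; hard negatives and the
# dropout-based self-supervised loss from the paper are omitted.
import torch
import torch.nn.functional as F

def entity_contrastive_loss(sent_emb: torch.Tensor,
                            ent_emb: torch.Tensor,
                            temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE loss pulling each sentence toward its related entity.

    sent_emb: (batch, dim) sentence embeddings.
    ent_emb:  (batch, dim) embeddings of the entity linked to each sentence;
              the other in-batch entities serve as negatives.
    """
    sent = F.normalize(sent_emb, dim=-1)
    ent = F.normalize(ent_emb, dim=-1)
    # Cosine-similarity matrix: row i compares sentence i with every entity.
    logits = sent @ ent.t() / temperature
    # The matching entity for sentence i sits on the diagonal.
    labels = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, labels)
```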

Cited by 12 publications (4 citation statements)
References 17 publications
“…SimCSE (Gao et al., 2021) utilizes dropout in the pretrained encoder, which has proven to be an efficient form of augmentation. Building on SimCSE, many subsequent works have enhanced this framework with auxiliary training objectives (Chuang et al., 2022; Nishikawa et al., 2022; Zhou et al., 2023; Wu et al., 2022), and Chanchani and Huang (2023) recently proposed maximizing alignment between texts and compositions of their phrasal constituents.…”
Section: Contrastive Learning
confidence: 99%
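For readers unfamiliar with the dropout trick this statement refers to, here is a hedged sketch: the same batch is encoded twice with dropout active, and the two noisy views of each sentence form a positive pair. The model name and [CLS] pooling below are illustrative assumptions, not the cited papers' exact configurations.

```python
# Sketch of SimCSE-style dropout augmentation; "bert-base-uncased" and
# [CLS] pooling are illustrative choices, not the cited papers' exact setups.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.train()  # keep dropout active so the two passes differ

sentences = ["a cat sat on the mat", "dogs chase cats"]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

# Encode the same batch twice; dropout noise makes the two views of each
# sentence differ slightly, and those two views form the positive pair.
z1 = encoder(**batch).last_hidden_state[:, 0]  # [CLS] pooling, view 1
z2 = encoder(**batch).last_hidden_state[:, 0]  # [CLS] pooling, view 2

sim = F.normalize(z1, dim=-1) @ F.normalize(z2, dim=-1).t() / 0.05
labels = torch.arange(sim.size(0))
loss = F.cross_entropy(sim, labels)  # other in-batch sentences are negatives
loss.backward()
```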
“…In order to alleviate the need for an annotated dataset, Gao et al. (2021) and Liu et al. (2021) proposed a simple contrastive learning framework that uses dropout noise within transformer layers to generate positive pairs. Nishikawa et al. (2022) proposed a contrastive learning method that learns sentence embeddings between sentences and their related entities sampled from Wikidata.…”
Section: Contrastive Learning
confidence: 99%
“…Meanwhile, SNCSE obtains sentence vectors through prompt learning and constructs soft negative examples via syntactic parsing with spaCy. EASE was proposed by Nishikawa et al. [11] to enhance sentence embedding learning with entity information. EASE constructs positive pairs from sentences and their hyperlinked entities in Wikipedia; a negative entity must have the same type as the positive entity and must not appear on the same Wikipedia page.…”
Section: Related Work
confidence: 99%
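The negative-entity rule quoted above (same type as the positive entity, not on the same Wikipedia page) can be sketched as a simple filter. The data structures below are hypothetical stand-ins for the paper's Wikipedia/Wikidata preprocessing, not its actual pipeline.

```python
# Hedged sketch of hard-negative entity selection as described above; the
# dictionaries and candidate pool are hypothetical stand-ins for the paper's
# Wikipedia/Wikidata preprocessing.
import random

def sample_hard_negative(positive_entity: str,
                         entity_type: dict[str, str],
                         page_entities: set[str],
                         candidates: list[str]) -> str | None:
    """Pick an entity of the same type that is absent from the source page.

    entity_type:   maps entity name -> type label (e.g. from Wikidata).
    page_entities: entities hyperlinked on the page the sentence came from.
    candidates:    pool of entities to draw negatives from.
    """
    pool = [e for e in candidates
            if entity_type.get(e) == entity_type.get(positive_entity)
            and e not in page_entities
            and e != positive_entity]
    return random.choice(pool) if pool else None
```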