2022
DOI: 10.48550/arxiv.2211.08769
Preprint

RetroMAE v2: Duplex Masked Auto-Encoder For Pre-Training Retrieval-Oriented Language Models

Abstract: To better support retrieval applications such as web search and question answering, growing effort has been made to develop retrieval-oriented language models (Gao and Callan, 2021; Xiao et al., 2022a). Most existing works focus on improving the semantic representation capability of the contextualized embedding of the [CLS] token. However, a recent study shows that the ordinary tokens besides [CLS] may provide extra information, which helps to produce a better representation (Lin et al., 2022). As such, it's…
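The abstract contrasts two sources of sentence representation: the contextualized [CLS] embedding and the ordinary token embeddings. Below is a minimal sketch of that distinction, not the paper's method; the BERT checkpoint name and the mean-pooling of ordinary tokens are illustrative assumptions.

```python
# Sketch: extract the [CLS] embedding vs. a mean-pooled embedding of the
# ordinary tokens. Checkpoint choice is an assumption, not from the paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def encode(text: str) -> tuple[torch.Tensor, torch.Tensor]:
    """Return (cls_embedding, mean_pooled_tokens) for one input text."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # [1, seq_len, dim]
    cls_embedding = hidden[:, 0]                     # contextualized [CLS]
    # Average the remaining ("ordinary") token embeddings, ignoring padding.
    mask = inputs["attention_mask"].unsqueeze(-1)    # [1, seq_len, 1]
    mean_tokens = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    return cls_embedding, mean_tokens

cls_vec, token_vec = encode("what is dense retrieval?")
print(cls_vec.shape, token_vec.shape)  # torch.Size([1, 768]) for both
```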

Cited by 0 publications
References 21 publications