2022
DOI: 10.48550/arxiv.2205.12035
Preprint

RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder

Cited by 5 publications (12 citation statements)
References 0 publications

“…Later on, auto-encoding based pre-training algorithms have received growing interest: the input sentences are encoded into embeddings and then reconstructed back into the original sentences (Lu et al., 2021). Recently proposed methods, such as SimLM and RetroMAE (Xiao et al., 2022a), extend the previous auto-encoding framework by upgrading the encoding and decoding mechanisms, which substantially improves the quality of deep semantic retrieval.…”
Section: Related Work
confidence: 99%
“…Later on, auto-encoding was found to be more effective (Lu et al., 2021), where the language models learn to reconstruct the input from the generated embeddings. The recent work RetroMAE (Xiao et al., 2022a) extends the previous auto-encoding methods by introducing enhanced encoding and decoding mechanisms, which leads to remarkable improvements on general retrieval benchmarks.…”
Section: Introduction
confidence: 99%
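
The citation statements above describe the auto-encoding pre-training setup behind RetroMAE: an encoder compresses a sentence into a single embedding, and a decoder must reconstruct a masked copy of the input from that embedding. Below is a minimal sketch of this idea; the layer counts, masking ratios, vocabulary size, and [MASK] token id are illustrative assumptions, not the exact RetroMAE configuration (which additionally uses an enhanced decoding mechanism).

```python
# Sketch of retrieval-oriented masked auto-encoding: a full encoder produces one
# sentence embedding from a lightly masked input, and a deliberately shallow decoder
# reconstructs an aggressively masked copy conditioned on that embedding.
# All sizes and ratios below are illustrative, not the published RetroMAE setup.
import torch
import torch.nn as nn

VOCAB, DIM, MAX_LEN = 30522, 256, 128
MASK_ID = 103  # assumed [MASK] token id (BERT-style vocabulary)

class RetroStyleMAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.pos = nn.Embedding(MAX_LEN, DIM)
        # "Strong" encoder: several transformer layers yield the sentence embedding.
        enc_layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=6)
        # "Weak" decoder: a single layer, so reconstruction must lean on the embedding.
        dec_layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec_layer, num_layers=1)
        self.lm_head = nn.Linear(DIM, VOCAB)

    def forward(self, enc_ids, dec_ids, labels):
        pos = torch.arange(enc_ids.size(1), device=enc_ids.device)
        # Encoder side: lightly masked input -> contextual states -> CLS-style embedding.
        h = self.encoder(self.embed(enc_ids) + self.pos(pos))
        sent_emb = h[:, 0]  # sentence embedding later used for dense retrieval
        # Decoder side: aggressively masked input, with the sentence embedding
        # injected at position 0 so reconstruction is conditioned on it.
        d = self.embed(dec_ids) + self.pos(pos)
        d = torch.cat([sent_emb.unsqueeze(1), d[:, 1:]], dim=1)
        logits = self.lm_head(self.decoder(d))
        return nn.functional.cross_entropy(
            logits.view(-1, VOCAB), labels.view(-1), ignore_index=-100)

def mask_tokens(ids, ratio):
    """Replace a random fraction of tokens with [MASK]; supervise only masked slots."""
    masked, labels = ids.clone(), torch.full_like(ids, -100)
    m = torch.rand(ids.shape) < ratio
    labels[m] = ids[m]
    masked[m] = MASK_ID
    return masked, labels

# Toy usage: light masking for the encoder input, heavy masking for the decoder input.
ids = torch.randint(1000, VOCAB, (2, MAX_LEN))
enc_ids, _ = mask_tokens(ids, 0.15)
dec_ids, labels = mask_tokens(ids, 0.50)
loss = RetroStyleMAE()(enc_ids, dec_ids, labels)
loss.backward()
```

The asymmetry is the point of the design: because the decoder is shallow and its input is heavily masked, most of the information needed for reconstruction has to flow through the single sentence embedding, which pressures the encoder to produce embeddings that are useful for retrieval.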