2022
DOI: 10.1007/978-3-030-99736-6_26

Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models

Cited by 19 publications (13 citation statements)
References 27 publications
“…Recently, dense retrieval models have been adapted to CLIR by replacing the encoder with a multilingual pretrained model, such as mBERT, XLM or XLM-R [3,27,32]. To utilize existing monolingual collections with a large number of relevance labels such as MS MARCO [2], dense retrievers with multilingual embeddings can be trained on such corpora with zero-shot transfer to CLIR by leveraging the multilinguality of the encoder [28,32].…”
Section: Background and Related Work (citation type: mentioning)
confidence: 99%
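To make the zero-shot transfer idea in the statement above concrete, here is a minimal sketch (an illustration under stated assumptions, not the paper's own implementation): a CLIR dual encoder that swaps in XLM-R via Hugging Face transformers, mean-pools token embeddings, and scores query-document pairs with a dot product. The pooling and scoring choices are common defaults, not details confirmed by the source.

# Hypothetical sketch: multilingual dual encoder for zero-shot CLIR.
# A model like this, trained only on English MS MARCO pairs, can score
# queries and documents in other languages via the shared embedding space.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")

def embed(texts):
    # Tokenize a batch and mean-pool last hidden states over non-pad tokens.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state       # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (B, H)

# An English query scored against non-English documents: no target-language
# relevance labels are needed, which is the zero-shot CLIR setting.
query = embed(["how do vaccines work"])
docs = embed([
    "Impfstoffe trainieren das Immunsystem.",          # German
    "Les vaccins entraînent le système immunitaire.",  # French
])
print(query @ docs.T)  # dot-product relevance scores, shape (1, 2)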
“…Recently, dense retrieval models have been adapted to CLIR by replacing the encoder with a multilingual pretrained model, such as mBERT, XLM or XLM-R [3,27,32]. To utilize existing monolingual collections with a large number of relevance labels such as MS MARCO [2], dense retrievers with multilingual embeddings can be trained on such corpora with zero-shot transfer to CLIR by leveraging the multilinguality of the encoder [28,32]. Alternatively, with the help of translation models, one can translate the monolingual training collection into the language pair of interest and train the retriever on it (a "translate-train" approach) [32,38].…”
Section: Background and Related Work (citation type: mentioning)
confidence: 99%
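The "translate-train" alternative mentioned in this statement can be sketched in the same spirit (again an illustration under stated assumptions, not the cited papers' pipeline): machine-translate the English training passages into the document language, then train the retriever on the resulting cross-language pairs. The MarianMT checkpoint name below is a real Hugging Face English-to-German model; the pairing step is schematic.

# Hypothetical sketch: building translate-train data for CLIR.
from transformers import MarianMTModel, MarianTokenizer

mt_name = "Helsinki-NLP/opus-mt-en-de"  # English -> German MT model
mt_tokenizer = MarianTokenizer.from_pretrained(mt_name)
mt_model = MarianMTModel.from_pretrained(mt_name)

def translate(texts):
    # Translate a batch of English strings into German.
    batch = mt_tokenizer(texts, padding=True, truncation=True,
                         return_tensors="pt")
    generated = mt_model.generate(**batch)
    return mt_tokenizer.batch_decode(generated, skip_special_tokens=True)

# English (query, positive passage) pairs, e.g. from MS MARCO, become
# (English query, German passage) pairs on which a retriever can be trained.
pairs = [("what is photosynthesis",
          "Photosynthesis converts light into energy.")]
clir_pairs = [(q, translate([p])[0]) for q, p in pairs]
print(clir_pairs)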