Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.424

BERT-QE: Contextualized Query Expansion for Document Re-ranking

Abstract: Query expansion aims to mitigate the mismatch between the language used in a query and in a document. However, query expansion methods can suffer from introducing non-relevant information when expanding the query. To bridge this gap, inspired by recent advances in applying contextualized models like BERT to the document retrieval task, this paper proposes a novel query expansion model that leverages the strength of the BERT model to select relevant document chunks for expansion. In evaluation on the standard T…
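
The abstract only sketches the pipeline, so the listing below is a minimal illustration of the chunk-selection idea it describes: top-ranked feedback documents are split into chunks, a relevance model scores each chunk against the query, and the highest-scoring chunks then contribute to the final score of every candidate document. This is a hedged sketch, not the authors' code: a toy word-overlap function stands in for the fine-tuned BERT cross-encoder, and the chunk size, the number of selected chunks, and the interpolation weight alpha are placeholder values rather than the paper's tuned settings.

# Sketch of the chunk-based expansion idea from the abstract (assumptions noted above).
from math import exp


def score(text_a: str, text_b: str) -> float:
    """Toy word-overlap relevance score; BERT-QE uses a BERT cross-encoder here."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    return len(a & b) / (len(a | b) or 1)


def chunks(document: str, size: int = 10) -> list[str]:
    """Split a document into fixed-size word chunks (a simplification of the paper's chunking)."""
    words = document.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def select_chunks(query: str, feedback_docs: list[str], k_chunks: int = 3) -> list[str]:
    """Score every chunk of the feedback documents against the query and keep the top ones."""
    candidates = [c for d in feedback_docs for c in chunks(d)]
    return sorted(candidates, key=lambda c: score(query, c), reverse=True)[:k_chunks]


def rerank(query: str, docs: list[str], feedback_docs: list[str], alpha: float = 0.4) -> list[str]:
    """Combine the query-document score with chunk-document scores, weighting each
    selected chunk by its softmax-normalised relevance to the query."""
    selected = select_chunks(query, feedback_docs)
    q_rel = [score(query, c) for c in selected]
    z = sum(exp(s) for s in q_rel) or 1.0
    weights = [exp(s) / z for s in q_rel]

    def final(d: str) -> float:
        chunk_part = sum(w * score(c, d) for w, c in zip(weights, selected))
        return (1 - alpha) * score(query, d) + alpha * chunk_part

    return sorted(docs, key=final, reverse=True)


if __name__ == "__main__":
    docs = [
        "bert based query expansion selects relevant chunks from feedback documents",
        "unrelated text about cooking pasta and tomato sauce",
    ]
    print(rerank("query expansion with bert", docs, feedback_docs=docs[:1]))

In BERT-QE itself every scoring step is a BERT inference, which is what makes the method effective but also computationally heavy, as several of the citation statements below point out.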

Cited by 56 publications (47 citation statements); References 26 publications.

Citation statements (ordered by relevance):
“…However, the use of BERT models directly within the pseudo-relevance feedback mechanism has seen comparatively little use in the literature. The current approaches leveraging the BERT contextualised embeddings for PRF are Neural PRF [18], BERT-QE [37] and CEQE [23].…”
Section: Related Work (mentioning)
confidence: 99%
“…feedback chunks that are extracted from the top-ranked feedback documents. This results in an expensive application of many BERT computations, approximately 11× as many GPU operations as a simple BERT reranker [37]. Both Neural PRF and BERT-QE approaches leverage contextualised language models to rerank an initial ranking of documents retrieved by a preliminary sparse retrieval system.…”
Section: Related Work (mentioning)
confidence: 99%
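One way to read that 11× factor: if the intermediate stage keeps on the order of ten chunks, each candidate document must be scored against every selected chunk in addition to the query itself, so the number of BERT forward passes grows to roughly 1 + 10 = 11 times that of a plain query-document reranker, though the exact multiple depends on how many chunks and feedback documents are used.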
“…An end-to-end neural PRF model (NPRF) proposed by Li et al. [14] uses a combination of models to compare document summaries and compute document relevance scores for feedback and achieves limited improvement while only using bag-of-words neural models. Later work combining BERT with an NPRF framework [41] illustrated the importance of an effective first-stage ranking method. A complementary vein of work [23] uses generative approaches to perform document expansion by predicting questions to add to the document.…”
Section: Supervised Expansion (mentioning)
confidence: 99%
“…Other researchers have also suggested that PRF can use other knowledge sources, such as Wikipedia. Re-ranking of the PRF results has likewise been carried out by other researchers using machine-learning classification methods [23], BERT [14][15][16], and a multi-stage approach using history, the original PRF, and BERT [24].…”
Section: Related Work (mentioning)
confidence: 99%
“…BERT is a method widely used in recent research because of the performance improvements it brings. However, previous studies used BERT as a document ranking method to replace the original PRF that uses TF-IDF term weighting [14][15][16]. The resulting documents contain other contexts that are semantically similar, while documents that are statistically similar are ignored.…”
Section: Introduction (mentioning)
confidence: 99%