2020
DOI: 10.48550/arxiv.2009.13815
Preprint
Neural Retrieval for Question Answering with Cross-Attention Supervised Data Augmentation

Abstract: Neural models that independently project questions and answers into a shared embedding space allow for efficient continuous space retrieval from large corpora. Independently computing embeddings for questions and answers results in late fusion of information related to matching questions to their answers. While critical for efficient retrieval, late fusion underperforms models that make use of early fusion (e.g., a BERT based classifier with cross-attention between question-answer pairs). We present a supervis…
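The late-fusion setup the abstract describes can be sketched in a few lines: questions and answers are encoded independently, answer embeddings are precomputed once, and the only question-answer interaction is the final dot product. This is a toy sketch, not the paper's method; the random-projection bag-of-words "encoder" below is a hypothetical stand-in for the paper's neural encoders, used only to make the fusion point concrete.

```python
import numpy as np

# Toy late-fusion ("dual encoder") retrieval sketch. The embed() function is
# a hypothetical random-projection encoder standing in for a learned model;
# an early-fusion cross-attention model would instead jointly encode each
# (question, answer) pair, which is too costly to run over a whole corpus.

rng = np.random.default_rng(0)
DIM = 64
PROJ = rng.normal(size=(1000, DIM))  # one random vector per vocabulary slot
VOCAB = {}

def embed(text):
    """Encode text independently of any paired input (late fusion)."""
    vec = np.zeros(DIM)
    for tok in text.lower().split():
        idx = VOCAB.setdefault(tok, len(VOCAB))
        vec += PROJ[idx]
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

answers = [
    "the capital of france is paris",
    "water boils at 100 degrees celsius",
    "bert uses transformer encoders",
]
# Answer embeddings are precomputed once. Question and answer information
# only meet at the dot-product scoring step -- "late" fusion.
answer_matrix = np.stack([embed(a) for a in answers])

def retrieve(question, k=1):
    scores = answer_matrix @ embed(question)
    return [answers[i] for i in np.argsort(-scores)[:k]]

print(retrieve("what is the capital of france"))
```

Because the dot product is the only interaction, scoring a query against millions of precomputed answer embeddings is a single matrix-vector product; the trade-off the abstract notes is that this forgoes the token-level cross-attention an early-fusion classifier can exploit.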

Cited by 9 publications (7 citation statements)
References 21 publications
“…The authors show that this approach is able to improve the effectiveness of document retrieval. Similarly, [101] presents a method for question answering that uses a neural retrieval component and a cross-attention component to generate a response based on the retrieved passages. They also propose a data augmentation method to improve the performance of the model.…”
Section: Dense Retrieval Methods
Mentioning confidence: 99%
“…Nie et al [93] proposed a decoupled encoding approach using DC-BERT, which enhances document retrieval effectiveness. Similarly, Yang et al [94] introduced a question-answering method that employs a neural retrieval component, a cross-attention mechanism for response generation, and a data augmentation technique to boost performance.…”
Section: Transformer-based Approaches
Mentioning confidence: 99%
“…Oppositely, we focus on the more challenging one, which allows unanswerable questions and "non-contiguous" answer (Ravichander et al, 2021). In relevant literature works, retrieval augmented methods are applied in various contexts including privacy policies (e.g., Van et al (2021); Keymanesh et al (2021); Yang et al (2020)). Non-retrieval data aggregation has also been studied under different NLP contexts (e.g., bagging (Breiman, 1996), meta learning (Parvez et al, 2019)).…”
Section: Supplementary Material: Appendices A Related Work
Mentioning confidence: 99%