2022
DOI: 10.48550/arxiv.2202.01110
Preprint

A Survey on Retrieval-Augmented Text Generation

Abstract: Recently, retrieval-augmented text generation has attracted increasing attention from the computational linguistics community. Compared with conventional generation models, retrieval-augmented text generation has remarkable advantages and, in particular, has achieved state-of-the-art performance in many NLP tasks. This paper aims to conduct a survey of retrieval-augmented text generation. It first highlights the generic paradigm of retrieval-augmented generation, and then reviews notable approaches according to di…

Cited by 8 publications (5 citation statements)
References 42 publications

“…Our experiments demonstrate that platforms have a variety of ways to prioritize cost vs. accuracy. Strategies include using smaller, less expensive LLMs (e.g., text-bison vs. text-unicorn); simplifying policies via automatic prompt optimization [43], [56]; or optimizing the number of examples in a prompt or even the candidate selection strategy [29]. We envision that real-world deployments will consist of distinct layers of LLM raters, each with supporting RAG databases.…”
Section: Optimizations and Future Work (mentioning)
confidence: 99%
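As a rough illustration of the layered-rater idea in the excerpt above, the sketch below cascades an item from a cheap rater to a more capable one when confidence is low, with a hypothetical retrieve_policy_examples helper standing in for each tier's RAG database. The model names, threshold, and helper are assumptions for illustration, not details of the cited work.

```python
# Hypothetical sketch of a tiered rater cascade: a cheap model handles most
# items and escalates low-confidence cases to a more capable (and costly) one.
# Model names and retrieve_policy_examples are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Rating:
    label: str
    confidence: float

def rate_with_model(model: str, item: str, examples: list[str]) -> Rating:
    # Placeholder for a real LLM call: the examples and item would be formatted
    # into a prompt and sent to `model`; here a canned rating is returned.
    confidence = 0.62 if model == "cheap-model" else 0.94
    return Rating(label="allowed", confidence=confidence)

def retrieve_policy_examples(item: str, k: int = 4) -> list[str]:
    # Placeholder RAG lookup: would fetch the k most similar labeled examples
    # from the tier's database; here it returns static examples.
    return [f"Example {i}: ... -> allowed" for i in range(k)]

def rate(item: str, threshold: float = 0.8) -> Rating:
    examples = retrieve_policy_examples(item, k=4)   # fewer examples = cheaper prompt
    first = rate_with_model("cheap-model", item, examples)
    if first.confidence >= threshold:
        return first                                 # cheap tier is confident enough
    more_examples = retrieve_policy_examples(item, k=8)
    return rate_with_model("capable-model", item, more_examples)

if __name__ == "__main__":
    print(rate("user-generated post to be moderated"))
```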
“…In generative LLMs, the technique of Retrieval Augmented Generation (RAG) is used to supply external context into the model's input [15] with the goal of augmenting its output quality in terms of factuality. More specifically, the original input is used to retrieve context based on a criterion, usually embedding similarity, from a pool of available factually correct and externally provided texts.…”
Section: Retrieval Augmented Generation (mentioning)
confidence: 99%
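To make the excerpt's description concrete, here is a minimal, self-contained sketch of embedding-similarity retrieval followed by prompt augmentation. A real RAG system would use a learned text encoder and a vector index; the toy bag-of-words embedding, the example pool, and the prompt template below are assumptions for illustration only.

```python
# Minimal sketch: retrieve the most similar texts from an external pool and
# prepend them to the prompt. The bag-of-words "embedding" is a toy stand-in
# for a dense sentence encoder.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: token counts instead of a learned dense vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, pool: list[str], k: int = 2) -> list[str]:
    # Rank the externally provided texts by similarity to the original input.
    q = embed(query)
    return sorted(pool, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def augment_prompt(query: str, pool: list[str]) -> str:
    # Supply the retrieved texts as grounding context in the model's input.
    context = "\n".join(retrieve(query, pool))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

pool = [
    "The Eiffel Tower is located in Paris and was completed in 1889.",
    "Mount Everest is the highest mountain above sea level.",
    "Paris is the capital of France.",
]
print(augment_prompt("When was the Eiffel Tower completed?", pool))
```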
“…In CoALA, a retrieval procedure (Li et al., 2022a; Gu et al., 2018) reads information from long-term memories into working memory. Depending on the information and memory type, it could be implemented in various ways, e.g., rule-based, sparse, or dense retrieval.…”
Section: Retrieval Actions (mentioning)
confidence: 99%
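The pluggable retrieval procedure described above can be sketched as follows: entries from long-term memory are read into working memory through an interchangeable retrieval function. The rule-based and keyword-overlap strategies, the memory format, and all names below are illustrative assumptions; a dense-retrieval variant would swap in a neural encoder and a vector index.

```python
# Hedged sketch of a retrieval action: read items from long-term memory into
# working memory, with the retrieval strategy supplied as a pluggable function.
from typing import Callable

LongTermMemory = list[str]

def rule_based(query: str, memory: LongTermMemory) -> list[str]:
    # Rule-based retrieval: a toy rule that returns every entry tagged "fact:".
    return [m for m in memory if m.startswith("fact:")]

def sparse(query: str, memory: LongTermMemory) -> list[str]:
    # Sparse retrieval: rank entries by keyword overlap with the query.
    q_tokens = set(query.lower().split())
    scored = [(len(q_tokens & set(m.lower().split())), m) for m in memory]
    return [m for score, m in sorted(scored, reverse=True) if score > 0][:3]

def read_into_working_memory(query: str,
                             memory: LongTermMemory,
                             retrieve: Callable[[str, LongTermMemory], list[str]]) -> dict:
    # The retrieval action: selected long-term entries become working-memory context.
    return {"query": query, "context": retrieve(query, memory)}

memory = [
    "fact: the user prefers concise answers",
    "episode: yesterday the agent booked a flight to Paris",
    "fact: the user's home airport is SFO",
]
print(read_into_working_memory("book a flight from the user's home airport", memory, sparse))
```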