2021
DOI: 10.48550/arxiv.2108.12516
Preprint

Few-Shot Table-to-Text Generation with Prototype Memory

Abstract: Neural table-to-text generation models have achieved remarkable progress on an array of tasks. However, due to the data-hungry nature of neural models, their performance relies strongly on large-scale training examples, limiting their applicability in real-world applications. To address this, we propose a new framework, Prototype-to-Generate (P2G), for table-to-text generation under the few-shot scenario. The proposed framework utilizes retrieved prototypes, which are jointly selected by an IR system and a n…
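The abstract describes a retrieve-then-generate pipeline: an IR system selects prototype sentences, which then condition the text generator. As a rough illustration only (the paper's actual IR system and neural selector are not specified in this excerpt), the retrieval step can be sketched with a simple term-overlap scorer over a prototype pool; all names and the `[SEP]` concatenation scheme below are hypothetical.

```python
# Hypothetical sketch of the retrieve-then-generate idea: score candidate
# prototype sentences against a linearized table with a simple word-overlap
# retriever, then prepend the best prototype to the generator input.
# The real P2G framework uses a trained IR system and a neural selector;
# this toy scorer only illustrates the data flow.

def linearize(table: dict) -> str:
    """Flatten a table's attribute-value pairs into a single query string."""
    return " ".join(f"{k} {v}" for k, v in table.items())

def retrieve_prototype(query: str, pool: list) -> str:
    """Pick the pool sentence with the highest word overlap with the query."""
    q = set(query.lower().split())
    return max(pool, key=lambda s: len(q & set(s.lower().split())))

# Toy example: a two-row table and a tiny prototype pool.
table = {"name": "Ada Lovelace", "occupation": "mathematician"}
pool = [
    "The city is located on the river bank.",
    "Ada Lovelace was an English mathematician and writer.",
]
prototype = retrieve_prototype(linearize(table), pool)
# The generator would then condition on prototype + table content.
generator_input = prototype + " [SEP] " + linearize(table)
```

In a real system the overlap scorer would be replaced by a learned retriever, and `generator_input` would be fed to a pretrained sequence-to-sequence model.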



Cited by 4 publications (1 citation statement)
References: 27 publications
“…(Li et al., 2021) uses a prefix set of tokens to better control the topic of the generated text. (Su et al., 2021a) extracts free-form prototypes from a large knowledge base to control the structural formation of the generated text. One of the recent works…”
Section: Category
mentioning confidence: 99%