Findings of the Association for Computational Linguistics: EMNLP 2021 2021
DOI: 10.18653/v1/2021.findings-emnlp.77
Few-Shot Table-to-Text Generation with Prototype Memory

Abstract: Neural table-to-text generation models have achieved remarkable progress on an array of tasks. However, due to the data-hungry nature of neural models, their performance strongly relies on large-scale training examples, limiting their applicability in real-world settings. To address this, we propose a new framework, Prototype-to-Generate (P2G), for table-to-text generation under the few-shot scenario. The proposed framework utilizes retrieved prototypes, which are jointly selected by an IR system and a n…
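The retrieve-then-generate design described in the abstract can be sketched as follows. This is a minimal illustration only: the function name, the token-overlap scoring heuristic, and the toy table and corpus are assumptions for the sketch, not the paper's actual IR system or prototype selector.

```python
# Minimal sketch of a Prototype-to-Generate (P2G) style retrieval step:
# given a table, pull prototype sentences from an unlabelled corpus.
# The overlap heuristic below is a stand-in for the paper's IR system
# and learned prototype selector.

def retrieve_prototypes(table, corpus, k=2):
    """Rank corpus sentences by lexical overlap with the table's values."""
    values = {v.lower() for _, v in table}
    def score(sentence):
        tokens = set(sentence.lower().split())
        return len(values & tokens)
    return sorted(corpus, key=score, reverse=True)[:k]

# Toy table (attribute-value pairs) and unlabelled corpus.
table = [("name", "Alice"), ("occupation", "chemist")]
corpus = [
    "Alice works as a chemist in Berlin.",
    "The weather was sunny yesterday.",
    "Bob is a retired teacher.",
]
prototypes = retrieve_prototypes(table, corpus, k=1)
```

In the full framework, the retrieved prototypes would then be fed to the generator together with the table, guiding the output toward fluent, in-domain text.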

Cited by 12 publications (17 citation statements)
References 22 publications
“…Since the rise of GPT-2 [60] and BERT [16], the research community has witnessed remarkable progress in the field of language model pre-training on a large amount of free text. Such advancements have led to significant progress in a wide range of natural language understanding (NLU) tasks [44,87,12,38] and text generation tasks [60,39,61,71,76,75,92,74,77].…”
Section: A Related Work
confidence: 99%
“…Data-to-Text Generation Recently, retrieval-augmented generation has been adapted to the task of data-to-text generation. To bridge the gap between structured data and natural language text, Su et al. (2021a) propose a novel retrieval-augmented framework. Specifically, given the source data, a set of candidate texts is first retrieved from a large unlabelled corpus.…”
Section: Text Style Transfer
confidence: 99%
“…al. [172] adopt a BM25 [268]-based information retrieval (IR) system in their prototype-to-generate (P2G) framework, which, aided by their BERT-based prototype selector, retrieves contextual samples for the input data instance from Wikipedia, enabling successful few-shot learning with T5. For reasoning over tabulated sport summaries, Li et.…”
Section: Regularization Techniques
confidence: 99%
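The BM25 scoring cited above as the IR component can be illustrated with a toy Okapi BM25 implementation. This is a self-contained sketch under common default parameters (k1=1.5, b=0.75); the corpus, function name, and query are illustrative assumptions, not the framework's actual retrieval stack.

```python
import math

# Toy Okapi BM25 scorer, illustrating the kind of IR system used to
# retrieve prototype sentences for an input data instance.

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Return one BM25 score per document for the given query."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(d) for d in tokenized) / len(tokenized)
    n = len(docs)
    scores = []
    for doc in tokenized:
        s = 0.0
        for term in query.lower().split():
            df = sum(term in d for d in tokenized)   # document frequency
            tf = doc.count(term)                     # term frequency
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            s += idf * tf * (k1 + 1) / (
                tf + k1 * (1 - b + b * len(doc) / avgdl))
        scores.append(s)
    return scores

# Illustrative corpus of sport-flavoured sentences.
docs = [
    "the striker scored two goals in the final",
    "the stock market closed higher today",
    "goals and assists decide the rating",
]
scores = bm25_scores("striker goals", docs)
best = max(range(len(docs)), key=scores.__getitem__)
```

In the P2G setting, the top-ranked documents would be passed to the learned prototype selector, which reranks them before generation.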