Proceedings of the 2019 International Symposium on Signal Processing Systems
DOI: 10.1145/3364908.3365287
Two-Level Model for Table-to-Text Generation

Cited by 4 publications (5 citation statements)
References 8 publications
“…Table 6 compares SAN-T2T with KL (Konstas and Lapata, 2013a), MBW (Mei et al, 2016), and Two-level model (Cao et al, 2019) on WeatherGov. There is no need to apply copy mechanism because its vocabulary size is small and each sample has 36 records of fixed length, but it could also be concluded that beam search has little improvements, which is consistent with Mei et al (2016).…”
Section: Results and Analysis
confidence: 99%
“…As shown in Table 2, SAN-T2T is compared with several previous works, including KN, Template KN (Heafield et al, 2013), NLM, Table NLM (Lebret et al, 2016), Table2Seq-Single (Bao et al, 2019), Structure-aware Seq2Seq (Liu et al, 2018), and Two-level model (Cao, Gong, and Zhang, 2019). Meanwhile, we provide a vanilla Seq2Seq model (without position encoding and content selector compared to SAN-T2T) and a Transformer model (Vaswani et al, 2017) to conduct an ablation study.…”
Section: Experiments and Analysis
confidence: 99%
“…An unordered record is formed by several key-value pairs, while an entity 𝑘ᵢ is also composed of several unordered disciplines (Figure 1). Cao et al. proposed a two-level model for table-to-text generation that employs an improved encoder-decoder architecture: two LSTM-RNNs encode the table at the field level and the value level, and a matching two-level attention mechanism at the decoding end captures the relationship between the words in the text and the fields in the table [5].…”
Section: Formal Description of the Task
confidence: 99%
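The citation above describes the two-level model's decoder-side mechanism: separate attention distributions over fields and over values, combined into one context vector. Below is a minimal NumPy sketch of one plausible reading of that combination step; all function and variable names are hypothetical illustrations, not the paper's actual implementation, and the field/value encodings are stand-ins for the outputs of the two encoder LSTMs.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def two_level_attention(dec_state, field_enc, value_enc):
    """Hypothetical sketch of a two-level attention step.

    dec_state: (d,)    current decoder hidden state
    field_enc: (F, d)  one encoding per table field (field-level encoder)
    value_enc: (F, d)  one encoding per field's value (value-level encoder)
    """
    # Level 1: attention over fields.
    alpha = softmax(field_enc @ dec_state)        # (F,)
    # Level 2: attention over the corresponding values.
    beta = softmax(value_enc @ dec_state)         # (F,)
    # Combine the two distributions and renormalize.
    gamma = alpha * beta
    gamma = gamma / gamma.sum()
    # Context vector fed back into the decoder.
    context = gamma @ value_enc                   # (d,)
    return gamma, context
```

The multiplicative combination lets the field-level weights gate the value-level weights, so a word in the output attends strongly to a value only when its field is also deemed relevant; other combination rules (e.g. hierarchical reweighting) would fit the quoted description equally well.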
“…This paper is an extended version of our earlier works [4] and [5]. Reference [4] introduced a two-level model for table-to-text generation, and [5] improved the model for the open-domain setting. This paper integrates the two with other methods to form a complete and relatively mature architecture, and greatly strengthens the interpretation and analysis.…”
Section: Introduction
confidence: 99%