2020
DOI: 10.1109/access.2020.2979115
Generating Natural Language Descriptions From Tables

Abstract: This paper proposes a neural generative architecture, namely NLDT, to generate a natural language short text describing a table which has formal structure and valuable information. Specifically, the architecture maps fields and values of a table to continuous vectors and then generates a natural language description by leveraging the semantics of a table. The NLDT architecture adopts a two-level neural model to make the most of the structure of a table to fully express the relationship between contents. To dea…
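The abstract's first step, mapping the fields and values of a table to continuous vectors, can be sketched roughly as follows. This is a minimal illustration under assumed details, not the paper's implementation: the example table, embedding dimension, and helper names are invented, and fixed random vectors stand in for learned embeddings.

```python
import random

random.seed(0)

EMB_DIM = 8  # assumed embedding size, for illustration only


def make_embedding(vocab, dim=EMB_DIM):
    """Map each token to a fixed random continuous vector (a stand-in
    for a learned embedding table)."""
    return {tok: [random.uniform(-1, 1) for _ in range(dim)] for tok in vocab}


def encode_table(table, field_emb, value_emb):
    """Encode each (field, value) cell as the concatenation of its field
    vector and value vector -- the cell-level step of a two-level encoder."""
    return [field_emb[f] + value_emb[v] for f, v in table]


# Hypothetical example table as (field, value) pairs.
table = [("name", "Ada"), ("born", "1815"), ("field", "mathematics")]

field_emb = make_embedding({f for f, _ in table})
value_emb = make_embedding({v for _, v in table})

cells = encode_table(table, field_emb, value_emb)
print(len(cells), len(cells[0]))  # → 3 16
```

A second, table-level encoder (e.g. an RNN over these cell vectors) would then aggregate the cells before a decoder generates the description; that step is omitted here.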

Cited by 10 publications (5 citation statements)
References 26 publications
“…Wang, Hsiao, and Chang (2020) draw their attention to communicative language technology that assists in writing an automatic paper based on an RNN and the TextRank algorithm. Cao (2020) studies the process of generating natural language descriptions from different kinds of tables. Zhu et al. (2020) outline modeling graph structure in a transformer for better abstract-meaning-representation-to-text generation.…”
Section: Literature Review (mentioning, confidence: 99%)
“…More recent studies have utilized the capabilities of pre-trained language models in their designs, but have also incorporated specialized encoder structures or attention mechanisms specifically for table inputs. These include encoder-only models (Arik and Pfister, 2019; Yin et al., 2020; Herzig et al., 2020; Huang et al., 2020; Iida et al., 2021; Eisenschlos et al., 2021; Yang et al., 2022), as well as encoder-decoder models (Cao, 2020; Andrejczuk et al., 2022; …). However, it should be noted that the encoder structures of these works are specifically tailored for table input and cannot be directly applied to other types of data.…”
Section: Related Work (mentioning, confidence: 99%)
“…In other research, Cao et al. [35] generate short text that explains a given table, proposing a neural generative architecture known as NLDT.…”
Section: Miscellaneous (mentioning, confidence: 99%)