2019
DOI: 10.48550/arxiv.1905.08407
Preprint

Generating Logical Forms from Graph Representations of Text and Entities

Cited by 4 publications (10 citation statements). References 0 publications.
“…A Hidden Markov Model-based chart (bar, line, etc.) recognition method is proposed in [14]. Graph neural networks have been employed in [5] to generate logical forms with entities from free text using BERT. In a very recent work [6], Obeid et al. have used transformer-based models for text generation from charts.…”
Section: Related Work
confidence: 99%
“…Among these, a few example tasks are: document summarization [1], title or caption generation from texts, generating textual descriptions of charts [2], named entity recognition [3], etc. There have been several attempts to generate graphs or structural elements from natural-language or free texts [4,5,6,7]. Scientific charts (bar, line, pie, etc.)…”
Section: Introduction
confidence: 99%
“…While traditional approaches (Kate & Mooney, 2006; Wong & Mooney, 2007; Clarke et al., 2010; Zettlemoyer & Collins, 2007; Kwiatkowski et al., 2011; Wang et al., 2015; Li et al., 2015; Cai & Yates, 2013; Berant et al., 2013; Quirk et al., 2015; Artzi et al., 2015; Zhang et al., 2017) rely on high-quality lexicons, manually built templates, and/or domain- or representation-specific features, more recent studies show impressive results with neural encoder-decoder models (Dong & Lapata, 2016; Jia & Liang, 2016; Herzig & Berant, 2017; Su & Yan, 2017; B. Chen & Han, 2018; Shaw et al., 2019).…”
Section: Related Work
confidence: 99%
“…Very recently, Shaw et al. (2019) presented an approach that uses a Graph Neural Network (GNN) architecture, which successfully incorporates information about relevant entities and their relations when parsing natural utterances. Similar to Vinyals et al. (2015), Jia & Liang (2016), and Herzig & Berant (2017), the decoder has a copying mechanism that can copy an entity to the output during parsing.…”
Section: Related Work
confidence: 99%
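The copying mechanism mentioned in the last excerpt can be sketched as a pointer-generator-style mixture: at each decoding step the model mixes a distribution over the output vocabulary with an attention distribution over input entities, gated by a scalar. The function names, shapes, and gating scheme below are illustrative assumptions, not the exact formulation used in the cited papers:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - np.max(x))
    return e / e.sum()

def copy_augmented_distribution(vocab_logits, copy_logits, entity_vocab_ids, p_copy):
    """Mix a generation distribution over the vocabulary with a copy
    distribution over input entities, gated by p_copy in [0, 1].

    entity_vocab_ids[i] is the vocabulary index that input entity i maps to.
    (All names and shapes here are hypothetical, for illustration only.)
    """
    p_gen = softmax(vocab_logits)      # generate a token from the vocabulary
    p_attn = softmax(copy_logits)      # attend over the input entities
    out = (1.0 - p_copy) * p_gen
    for i, vid in enumerate(entity_vocab_ids):
        out[vid] += p_copy * p_attn[i]  # route copy probability mass to entity tokens
    return out

# Toy example: 6-token vocabulary, two input entities at vocab ids 4 and 5.
dist = copy_augmented_distribution(
    vocab_logits=np.array([1.0, 0.5, 0.2, 0.1, 0.0, 0.0]),
    copy_logits=np.array([2.0, 0.5]),  # entity 0 is attended more strongly
    entity_vocab_ids=[4, 5],
    p_copy=0.6,
)
```

With a high copy gate, most probability mass lands on the more strongly attended entity's token (id 4 here), even though its generation logit is zero; in a trained model the gate itself is typically predicted from the decoder state.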