Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2016
DOI: 10.18653/v1/p16-1086
Text Understanding with the Attention Sum Reader Network

Abstract: Several large cloze-style context-question-answer datasets have been introduced recently: the CNN and Daily Mail news data and the Children's Book Test. Thanks to the size of these datasets, the associated text comprehension task is well suited for deep-learning techniques that currently seem to outperform all alternative approaches. We present a new, simple model that uses attention to directly pick the answer from the context as opposed to computing the answer using a blended representation of words in the do…
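The mechanism the abstract describes can be summarized as a single attention step followed by a pointer-style sum: the question representation is compared against each context position, the resulting attention weights are normalized, and the probability mass of repeated occurrences of the same candidate answer is summed rather than blended. A minimal NumPy sketch, assuming the encoder outputs are already computed; the function name, shapes, and inputs are illustrative, not the paper's actual implementation:

```python
import numpy as np

def attention_sum_reader(context_ids, candidate_ids, d_enc, q_enc):
    """Pick an answer from the context via one attention step plus pointer-sum.

    context_ids: (T,) int array of token ids for the context document
    candidate_ids: iterable of candidate answer token ids
    d_enc: (T, h) contextual embeddings of context words (e.g. from a bi-GRU)
    q_enc: (h,) question embedding (e.g. concatenated final GRU states)
    """
    scores = d_enc @ q_enc                 # dot-product attention, one step
    probs = np.exp(scores - scores.max())  # numerically stable softmax
    probs /= probs.sum()                   # attention over context positions
    # Pointer-sum: aggregate probability over all occurrences of a candidate,
    # instead of blending word representations into an answer vector.
    answer_probs = {c: probs[context_ids == c].sum() for c in candidate_ids}
    return max(answer_probs, key=answer_probs.get)
```

Summing over occurrences is the distinctive design choice: a candidate that appears several times in the context can accumulate more total attention mass than a candidate with a single, individually higher-scoring occurrence.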

Cited by 247 publications (255 citation statements)
References 10 publications
“…Hill et al (2016) propose a window-based memory network for the CBT dataset. Kadlec et al (2016) introduce pointer networks with one attention step to predict the blanked-out entities. Sordoni et al (2016) propose an iterative alternating attention mechanism to better model the links between question and passage.…”
Section: Related Work
confidence: 99%
“…Our Chengyu cloze test task is similar to reading comprehension (Hermann et al, 2015; Cui et al, 2016; Kadlec et al, 2016; Seo et al, 2016) (Xu et al, 2010) and improve Chinese word segmentation (Chan and Chong, 2008; Sun and Xu, 2011; Wang and Xu, 2017). Chengyus differ from metaphors in other languages (Tsvetkov et al, 2014; Shutova, 2010) because they do not follow the grammatical structure and syntax of modern Chinese.…”
Section: Related Work
confidence: 99%
“…These tasks include but are not limited to: performing reasoning over a simulated environment for QA, factoid and non-factoid based QA using both knowledge bases and unstructured text (Kumar et al, 2015; Hill et al, 2016; Chandar et al, 2016; Bordes et al, 2015), goal-driven dialog (Dodge et al, 2016; Weston, 2016), automatic story comprehension from both video and text (Tapaswi et al, 2015), and transferring knowledge from one knowledge base while learning to answer questions on a different knowledge base (Bordes et al, 2015). Recently, various other attention-based neural models (similar to Memory Networks) have been proposed to tackle the machine comprehension task by QA from unstructured text (Kadlec et al, 2016; Sordoni et al, 2016; Chen et al, 2016). To the best of our knowledge, knowledge transfer from an unstructured text dataset to another unstructured text dataset for machine comprehension is not explored yet.…”
Section: Related Work
confidence: 99%