Proceedings of the 22nd Conference on Computational Natural Language Learning 2018
DOI: 10.18653/v1/k18-1030

Sequence Classification with Human Attention

Abstract: Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior, and in this paper we show that human attention really does provide a good inductive bias on many attention functions in NLP. Specifically, we use estimated human attention derived from eye-tracking corpora to regularize attention functions in recurrent neural networks. We show substantial improvements across a range of tasks, including sentiment analysis, grammatical error detection, and detection of abusiv…
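
The regularization described in the abstract can be sketched concretely. The snippet below is a minimal illustration, not the authors' released code: it assumes the model exposes a per-token attention distribution and adds a squared-error penalty pulling it toward normalized human reading times. The function name, the choice of MSE, and the weight lam are assumptions for illustration; Barrett et al. (2018) treat human attention as an auxiliary prediction task, so a single summed objective is a simplification.

```python
import torch
import torch.nn.functional as F

def joint_loss(task_logits, labels, model_attn, reading_times, lam=0.1):
    """Task loss plus a penalty tying model attention to human attention.

    task_logits:   (batch, n_classes) classifier outputs
    labels:        (batch,) gold class labels
    model_attn:    (batch, seq_len) attention weights, rows sum to 1
    reading_times: (batch, seq_len) per-token human reading times (ms)
    """
    task_loss = F.cross_entropy(task_logits, labels)
    # Normalize reading times into a distribution over tokens.
    human_attn = reading_times / reading_times.sum(dim=1, keepdim=True)
    # Squared-error penalty between model and human attention.
    attn_loss = F.mse_loss(model_attn, human_attn)
    return task_loss + lam * attn_loss
```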

Cited by 93 publications (100 citation statements) · References 48 publications
“…This also opens the question of whether a different architecture could better suit the purpose of leveraging the gaze information in a consistent way. In this context, a potential line of work could adapt human-attention approaches (Barrett et al., 2018) for structured prediction and word-level classification, although it would come at a cost of speed for parsing as sequence labeling (Strzyz et al., 2019b).…”
Section: Discussion · Citation type: mentioning · Confidence: 99%
“…Our work focuses on exploiting the utility of gaze information using just a standard BiLSTM, directly building on top of previous work on dependency parsing as sequence labeling (Strzyz et al., 2019b), and ignoring extra tools such as attention. Along this line, a possible future solution could be to apply the approach by Barrett et al. (2018) to structured prediction and word-level classification. In their work they used human data as an inductive bias to update the attention weights of the network.…”
Section: Gaze Information · Citation type: mentioning · Confidence: 99%
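
For concreteness, here is a minimal sketch of the kind of attentive BiLSTM classifier such an adaptation would start from: a standard BiLSTM with an additive attention layer whose weights are exposed so they can be regularized toward gaze data, for example with the joint loss sketched earlier. The class name and all hyperparameters are illustrative, not from either paper.

```python
import torch
import torch.nn as nn

class AttentiveBiLSTM(nn.Module):
    """Sequence classifier: BiLSTM encoder + additive attention pooling.
    Returns the attention weights so a gaze-based penalty can be applied.
    All sizes are illustrative defaults, not values from the paper."""

    def __init__(self, vocab_size, emb_dim=100, hidden=128, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.scorer = nn.Linear(2 * hidden, 1)  # per-token attention score
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, tokens):                   # tokens: (B, T) int ids
        h, _ = self.lstm(self.emb(tokens))       # (B, T, 2H)
        attn = torch.softmax(self.scorer(h).squeeze(-1), dim=1)  # (B, T)
        context = torch.bmm(attn.unsqueeze(1), h).squeeze(1)     # (B, 2H)
        return self.out(context), attn
```

During training, the returned attention vector and the logits would both feed the joint objective sketched above.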
“…In this paper, we merely use the TRT feature, which represents total human attention on words during reading. This feature is also used by Carpenter and Just (1983) and Barrett et al. (2018). We then divide TRT values by the number of participants to get an average TRT (ATRT).…”
Section: Eye-tracking Corpus · Citation type: mentioning · Confidence: 99%
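
The averaging step quoted above is simple to reproduce; the reading-time values and participant count below are hypothetical.

```python
import numpy as np

# Hypothetical total reading times (TRT, ms) per token, summed over all
# participants who read the sentence in an eye-tracking corpus.
trt = np.array([812.0, 240.0, 0.0, 1105.0, 390.0])
n_participants = 10  # assumed number of readers

# Average TRT (ATRT) per token, as described in the citing paper.
atrt = trt / n_participants  # [81.2, 24.0, 0.0, 110.5, 39.0]
```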
“…Per-word reading time is indicative of textual (as well as lexical, syntactic, and semantic) processing (Demberg and Keller, 2008), and thus reflects human attention to content. To obtain human attention during reading, this paper estimates eye fixation durations from an eye-tracking corpus, inspired by Carpenter and Just (1983) and Barrett et al. (2018). Modern eye-tracking equipment yields very rich and detailed datasets (Cop et al., 2017).…”
Section: Introduction · Citation type: mentioning · Confidence: 99%