Proceedings of the Canadian Conference on Artificial Intelligence 2021
DOI: 10.21428/594757db.90170c50
Sentiment Analysis with Cognitive Attention Supervision

Abstract: Neural network-based language models such as BERT (Bidirectional Encoder Representations from Transformers) use attention mechanisms to create contextualized representations of inputs, conceptually analogous to humans reading words in context. For the task of classifying the sentiment of texts, we ask whether BERT's attention can be informed by human cognitive data. During training, we supervise attention with eye-tracking and/or brain imaging data and combine binary sentiment classification loss with these at…
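The training objective the abstract describes — a binary sentiment classification loss combined with an attention-supervision term that pulls model attention toward human cognitive signals — can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the MSE form of the attention term, the weighting factor `lam`, and all function and variable names are assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def combined_loss(logits, label, attn_weights, human_attn, lam=0.1):
    """Binary classification loss plus an attention-supervision term.

    logits       : class scores for [negative, positive] sentiment
    label        : gold class index (0 or 1)
    attn_weights : model attention distribution over tokens (sums to 1)
    human_attn   : normalized human signal over the same tokens, e.g.
                   fixation durations from eye-tracking
    lam          : illustrative trade-off weight (an assumption)
    """
    probs = softmax(logits)
    cls_loss = -np.log(probs[label])          # cross-entropy on the gold class
    # Mean squared error between model and human attention distributions;
    # the paper may use a different divergence — this is a sketch.
    attn_loss = np.mean((attn_weights - human_attn) ** 2)
    return cls_loss + lam * attn_loss

# Toy example: 4 tokens, positive-sentiment label.
logits = np.array([1.2, -0.3])
attn = softmax(np.array([0.5, 2.0, 0.1, 0.4]))      # model attention
gaze = np.array([0.1, 0.6, 0.1, 0.2])               # normalized fixations
loss = combined_loss(logits, 1, attn, gaze)
```

With `lam=0` the objective reduces to plain sentiment classification; increasing `lam` trades classification fit against agreement with the human attention distribution.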

Cited by 1 publication (1 citation statement)
References 29 publications
“…Among the few studies on attention supervision, [18] showed that supervision can harm classification performance in sentiment classification tasks. Regularization was considered to circumvent the issue of a rather flat distribution of attention weights as reported by [13].…”
Section: Related Work (citation type: mentioning; confidence: 99%)