2018
DOI: 10.48550/arxiv.1811.05544
Preprint
An Introductory Survey on Attention Mechanisms in NLP Problems

Cited by 3 publications
(1 citation statement)
References 12 publications
“…The attention mechanism has achieved great success and is commonly used in seq2seq models for different natural language processing (NLP) tasks [103], such as machine translation [23,21], image captioning [104], and neural abstractive text summarization [14,12,17]. In an attention-based encoder-decoder architecture (shown in Fig.…”
Section: B. Attention Mechanism
confidence: 99%
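The citation statement above refers to attention in a seq2seq encoder-decoder architecture. As a minimal sketch (not from the cited survey), the core step can be illustrated with Luong-style dot-product attention, where a decoder query is scored against encoder hidden states and the normalized scores weight a context vector; the function and variable names here are illustrative, and NumPy is assumed:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(query, keys, values):
    """Dot-product attention (one decoder step).

    query:  (d,)   current decoder hidden state
    keys:   (T, d) encoder hidden states
    values: (T, d) typically the same encoder states
    Returns the context vector (d,) and attention weights (T,).
    """
    scores = keys @ query        # alignment scores, shape (T,)
    weights = softmax(scores)    # normalize into a distribution over source positions
    context = weights @ values   # weighted sum of values, shape (d,)
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))
query = rng.normal(size=3)
context, weights = dot_product_attention(query, keys, keys)
```

In a full encoder-decoder model the context vector would be concatenated with the decoder state before predicting the next token; variants differ mainly in how the alignment scores are computed (dot product, bilinear, or a small feed-forward network).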