2017 IEEE International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2017.80
A Read-Write Memory Network for Movie Story Understanding

Abstract: We propose a novel memory network model named Read-Write Memory Network (RWMN) to perform question and answering tasks for large-scale, multimodal movie story understanding. The key focus of our RWMN model is to design the read network and the write network that consist of multiple convolutional layers, which enable memory read and write operations to have high capacity and flexibility. While existing memory-augmented network models treat each memory slot as an independent block, our use of multi-layered CNNs …
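The write network described in the abstract can be pictured as strided 1-D convolutions sliding over the sequence of movie-segment embeddings, so each written memory slot aggregates several neighboring segments rather than storing one segment independently. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' released implementation; the class name, feature dimension, layer count, kernel size, and strides are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class ConvWriteNetwork(nn.Module):
    """Hypothetical write network: stacked strided 1-D convolutions compress
    the sequence of segment embeddings into fewer, higher-level memory slots,
    so each written slot mixes information from several neighboring segments."""
    def __init__(self, dim=512, num_layers=3, kernel_size=5):
        super().__init__()
        layers = []
        for _ in range(num_layers):
            layers += [
                nn.Conv1d(dim, dim, kernel_size, stride=2, padding=kernel_size // 2),
                nn.ReLU(),
            ]
        self.net = nn.Sequential(*layers)

    def forward(self, segment_embeddings):
        # segment_embeddings: (batch, num_segments, dim)
        x = segment_embeddings.transpose(1, 2)  # -> (batch, dim, num_segments)
        memory = self.net(x)                    # each stride-2 layer roughly halves the story length
        return memory.transpose(1, 2)           # -> (batch, num_slots, dim)

# Example: 40 movie segments of 512-d features compress to 5 memory slots.
write = ConvWriteNetwork()
memory = write(torch.randn(2, 40, 512))  # shape: (2, 5, 512)
```

A read network can be sketched analogously, applying convolutions over the stored memory before a question-conditioned attention read.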

Cited by 96 publications (78 citation statements); citing publications span 2018 to 2023. References 25 publications.
“…While iteratively taking the question and answers for temporal attention generation, memories are progressively updated to accumulate cues that locate the temporal parts relevant to answering the question. We observed that single-step temporal attention on memory networks [26,21] often generates blurry attention maps. The multi-step nature of the progressive attention mechanism enables generating a sharper attention distribution.…”
Section: Progressive Attention Mechanism
Confidence: 86%
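The progressive attention idea in the statement above can be illustrated with a short sketch. This is a hedged approximation, not the cited papers' exact formulation: the function name, dot-product scoring, additive query update, and the choice of three steps are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def progressive_temporal_attention(memory, question, num_steps=3):
    """Illustrative multi-step temporal attention: the query is refined over
    several read steps, so the attention distribution over memory slots can
    become sharper than a single-step read (the num_steps=1 case)."""
    # memory:   (num_slots, dim) story memory
    # question: (dim,) question embedding
    query = question
    for _ in range(num_steps):
        scores = memory @ query          # (num_slots,) relevance of each temporal slot
        attn = F.softmax(scores, dim=0)  # temporal attention map
        read = attn @ memory             # attended memory summary
        query = query + read             # accumulate cues for the next step
    return attn, read
```

Setting num_steps=1 reproduces the single-step read that the citing authors report produces blurrier temporal attention maps.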
“…The attention mechanism has been widely adopted to retrieve the information relevant to the question. We observed that single-step attention on memory networks [21,26] often generates blurred temporal attention maps.…”
Section: Introduction
Confidence: 85%