2021
DOI: 10.1016/j.imavis.2021.104246
Flow guided mutual attention for person re-identification

Cited by 8 publications (11 citation statements)
References 8 publications
“…A gated attention network typically receives a gating signal from another module that provides contextual information (Kiran et al, 2021;Bhuiyan et al, 2020;Subramaniam et al, 2019). We here propose to use a gating signal from one modality to act on the backbone of the other modality during training and thereby especially reinforce the features needed for cross-modal matching.…”
Section: Gated Attention Network
confidence: 99%
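The gating idea described in the statement above can be sketched as follows. This is a minimal, hypothetical illustration (function names, shapes, and the single projection matrix are assumptions, not taken from the cited papers): a context vector from one modality is projected to per-channel gates in (0, 1), which then reweight the feature map of the other modality's backbone.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_attention(features, gating_signal, W):
    """Channel-wise gated attention (illustrative sketch).

    features:      (C, H, W_sp) feature map from the backbone being gated.
    gating_signal: (D,) context vector from the other modality/module.
    W:             (C, D) learned projection mapping context to per-channel gates.
    """
    gates = sigmoid(W @ gating_signal)       # (C,) gate values in (0, 1)
    return features * gates[:, None, None]   # reweight each feature channel

# Toy usage with random tensors standing in for real backbone features.
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))   # features from one modality
ctx = rng.standard_normal(16)           # gating signal from the other modality
W = 0.1 * rng.standard_normal((8, 16))
out = gated_attention(feat, ctx, W)
```

Because each gate lies in (0, 1), the mechanism can only attenuate channels, never amplify them, which is what lets the gating signal "select" the filters most useful for cross-modal matching.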
“…To increase the feature granularity, there is a common trend of using the attention mechanism to address the issue of misalignment in re-identification. Inspired by the recent success of the gated attention mechanism (Kiran et al, 2021; Bhuiyan et al, 2020; Subramaniam et al, 2019), we propose to additionally integrate a cross-modal gated attention mechanism to mitigate the misalignment issue by dynamically selecting the CNN filters. Most of these state-of-the-art approaches use different contextual information to gate the backbone architecture.…”
Section: Introduction
confidence: 99%