2022
DOI: 10.48550/arxiv.2207.09603
Preprint

AiATrack: Attention in Attention for Transformer Visual Tracking

Abstract: Transformer trackers have achieved impressive advancements recently, where the attention mechanism plays an important role. However, the independent correlation computation in the attention mechanism could result in noisy and ambiguous attention weights, which inhibits further performance improvement. To address this issue, we propose an attention in attention (AiA) module, which enhances appropriate correlations and suppresses erroneous ones by seeking consensus among all correlation vectors. Our AiA module c…
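The abstract's core idea — refining a query-key correlation map by letting the correlation vectors reach consensus before the softmax — can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual implementation: the function name `aia_attention`, the use of plain linear maps `W_q`/`W_k` for the inner attention, and the residual update are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def aia_attention(Q, K, V, W_q, W_k):
    """Attention with a simplified 'attention in attention' refinement.

    Each column of the query-key correlation map is treated as a
    'correlation vector'; a lightweight inner attention lets these vectors
    seek consensus (reinforcing correlations shared by many vectors,
    damping outliers) before the usual softmax-weighted sum over V.
    Shapes: Q (Nq, d), K (Nk, d), V (Nk, d_v), W_q/W_k (Nq, d_inner).
    """
    d = Q.shape[-1]
    M = Q @ K.T / np.sqrt(d)        # (Nq, Nk) raw correlation map
    C = M.T                          # (Nk, Nq): one correlation vector per key
    d_inner = W_q.shape[-1]
    inner = softmax((C @ W_q) @ (C @ W_k).T / np.sqrt(d_inner))  # (Nk, Nk)
    C = C + inner @ C                # residual consensus among vectors
    M_refined = C.T                  # back to (Nq, Nk)
    return softmax(M_refined) @ V    # standard attention output (Nq, d_v)
```

The inner attention operates on the correlation map itself rather than on the features, which is what distinguishes the AiA idea from stacking a second ordinary attention layer.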

Cited by 3 publications (3 citation statements)
References 61 publications
“…Trackers employing transformers can be categorized into two classes: CNN-transformer-based trackers such as STARK [6], AiATrack [25], TransT [18], and TrTr [26], and fully transformer-based trackers such as SwinTrack [27], MixFormer [19], ProContEXT [28], and VideoTrack [29].…”
Section: Transformer-based Trackers
Mentioning confidence: 99%
“…SiamFC++ [6] and SiamCAR [7] algorithms once again introduced the Anchor-Free strategy from object detection into the tracking field, alleviating the problem of hyperparameter sensitivity and improving tracking accuracy. TransT [8], STARK [9], TrDiMP [10], and AiATrack [11] algorithms introduced Transformers for feature enhancement and fusion on the Siamese network, which greatly improved the tracking performance of the algorithms. However, it is clear that the performance improvement of the above algorithms comes at the expense of tracking speed.…”
Section: Siamese Network-based Object Tracking Algorithm
Mentioning confidence: 99%
“…This tracker leverages the prior knowledge of similarity scores obtained in the early stages and proposes an in-network early candidate elimination module, thereby reducing inference time. To address the issue of inhibited performance improvements due to independent correlation calculations in attention mechanisms, AiATrack [22] introduces an Attention in Attention (AiA) module. This module enhances appropriate correlations and suppresses erroneous correlations by seeking consensus among all correlation vectors.…”
Section: Introduction
Mentioning confidence: 99%