2022
DOI: 10.48550/arxiv.2207.05526
Preprint

Long-term Leap Attention, Short-term Periodic Shift for Video Classification

Hao Zhang,
Lechao Cheng,
Yanbin Hao
et al.

Abstract: A video transformer naturally incurs a heavier computation burden than a static vision transformer, as the former processes a T-times longer sequence than the latter under the current attention of quadratic complexity (O(T²S²)). Existing works treat the temporal axis as a simple extension of the spatial axes, focusing on shortening the spatio-temporal sequence by either generic pooling or local windowing, without utilizing temporal redundancy. However, videos naturally contain redundant information between neighboring frames…
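As a rough, back-of-the-envelope illustration of the complexity claim in the abstract, the sketch below compares the number of pairwise token interactions in full self-attention for a single image versus joint spatio-temporal attention over a video clip. The specific values of S (spatial tokens per frame) and T (frames) are hypothetical example numbers, not settings from the paper.

```python
# Minimal sketch of the attention-cost argument: a static vision transformer
# attends over S tokens (O(S^2) pairs), while a video transformer attending
# jointly over T*S tokens pays O(T^2 * S^2) pairs -- a T^2 blow-up.
S = 196   # e.g. 14 x 14 patch tokens per frame (hypothetical)
T = 16    # e.g. 16 sampled frames (hypothetical)

image_attention_pairs = S ** 2            # static vision transformer
video_attention_pairs = (T * S) ** 2      # joint spatio-temporal attention

print(f"image: {image_attention_pairs:,} token pairs")
print(f"video: {video_attention_pairs:,} token pairs")
print(f"ratio: {video_attention_pairs // image_attention_pairs}x")  # equals T^2 = 256x here
```

With these example numbers the video sequence is 16x longer, so the quadratic attention cost grows by 16² = 256x, which is the redundancy the paper's long-term/short-term design aims to exploit.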
