2022 International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra46639.2022.9812106
Abnormal Occupancy Grid Map Recognition using Attention Network

Cited by 2 publications (1 citation statement)
References 24 publications
“…But these methods ignore the linkage between global feature information and local feature information, which can affect the fusion of features and the generation of accurate attention maps. Deng et al. built a csRSE module for occupancy grid map recognition [24], which contains a residual block for generating hierarchical features, followed by a channel SE block and a spatial SE block for adequate information extraction along the channel and spatial dimensions. To achieve more flexible computation allocation and content awareness, Zhu et al. introduced content-independent sparsity into the attention mechanism and proposed BiFormer, which selectively attends to relevant tokens in an adaptive manner without dispersing attention to other unrelated tokens [25].…”
Section: Related Work (mentioning)
confidence: 99%
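The one-sentence description of the csRSE module in [24] (a residual block, followed by a channel SE block and a spatial SE block) is concrete enough to sketch. The PyTorch code below is a minimal illustrative reconstruction, not the authors' implementation: the class names (ChannelSE, SpatialSE, CsRSE), the 3x3 convolution stack, the reduction ratio, and the serial channel-then-spatial gating order are all assumptions based only on the description quoted above.

```python
import torch
import torch.nn as nn

class ChannelSE(nn.Module):
    """Channel squeeze-and-excitation: global-pool, then gate each channel."""
    def __init__(self, channels, reduction=16):  # assumes channels >= reduction
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w

class SpatialSE(nn.Module):
    """Spatial squeeze-and-excitation: a 1x1 conv produces a per-pixel gate."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(x))

class CsRSE(nn.Module):
    """Sketch of a csRSE-style block: residual features, then cSE and sSE gating."""
    def __init__(self, channels):
        super().__init__()
        self.residual = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.cse = ChannelSE(channels)
        self.sse = SpatialSE(channels)

    def forward(self, x):
        h = self.residual(x)             # hierarchical features
        h = self.sse(self.cse(h))        # channel SE, then spatial SE
        return torch.relu(h + x)         # residual shortcut
```

For a 64-channel grid-map feature of shape (1, 64, 32, 32), CsRSE(64)(x) returns a tensor of the same shape.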
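The bi-level routing idea behind BiFormer [25] can likewise be illustrated. The sketch below is a much-simplified, single-head reconstruction under my own assumptions: regions are summarized by mean pooling, each query region is routed to its top-k most relevant regions, and token-level attention runs only over the routed tokens. The published BiFormer additionally uses multi-head attention and a local-context enhancement term, among other details omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLevelRoutingAttention(nn.Module):
    """Single-head sketch of bi-level routing attention: route each query
    region to its top-k most relevant regions, then attend only over the
    tokens inside those routed regions."""
    def __init__(self, dim, num_regions=4, topk=2):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)
        self.s = num_regions           # region grid is s x s
        self.topk = topk
        self.scale = dim ** -0.5

    def forward(self, x):
        # x: (B, H, W, C); H and W must be divisible by self.s
        B, H, W, C = x.shape
        s, k = self.s, self.topk
        hr, wr = H // s, W // s        # region height and width in tokens
        n = hr * wr                    # tokens per region
        q, kx, v = self.qkv(x).chunk(3, dim=-1)

        def regionize(t):              # (B, H, W, C) -> (B, s*s, n, C)
            t = t.reshape(B, s, hr, s, wr, C).permute(0, 1, 3, 2, 4, 5)
            return t.reshape(B, s * s, n, C)

        q, kx, v = regionize(q), regionize(kx), regionize(v)

        # coarse level: region-to-region affinity from mean-pooled descriptors
        qr, kr = q.mean(dim=2), kx.mean(dim=2)        # (B, s*s, C)
        affinity = qr @ kr.transpose(-1, -2)          # (B, s*s, s*s)
        idx = affinity.topk(k, dim=-1).indices        # (B, s*s, k)

        # gather key/value tokens of the k routed regions per query region
        idx = idx[..., None, None].expand(-1, -1, -1, n, C)
        kg = torch.gather(kx[:, None].expand(-1, s * s, -1, -1, -1), 2, idx)
        vg = torch.gather(v[:, None].expand(-1, s * s, -1, -1, -1), 2, idx)
        kg = kg.reshape(B, s * s, k * n, C)
        vg = vg.reshape(B, s * s, k * n, C)

        # fine level: ordinary attention, restricted to the routed tokens
        attn = F.softmax((q @ kg.transpose(-1, -2)) * self.scale, dim=-1)
        out = attn @ vg                               # (B, s*s, n, C)
        out = out.view(B, s, s, hr, wr, C).permute(0, 1, 3, 2, 4, 5)
        return self.proj(out.reshape(B, H, W, C))
```

With dim=64 and an input of shape (1, 32, 32, 64), BiLevelRoutingAttention(64)(x) returns the same shape; attention cost scales with k regions per query rather than the full token grid, which is the flexible computation allocation the quoted statement refers to.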