2021
DOI: 10.3788/lop202158.0810018
Real-Time Semantic Segmentation Network Based on Regional Self-Attention

Cited by 2 publications (1 citation statement); references 0 publications.
“…The attention mechanism seeks to identify relevant information and disregard irrelevant information, thereby enhancing the efficiency of neural networks. By obtaining detailed information and suppressing unnecessary data, it becomes possible to improve the network's performance [29,30]. In order to do this, we suggest a fusion approach that combines the cross-stage partial (CSP) module built into the convolutional block attention module (CBAM) attention mechanism with the global attentional map (GAM) mechanism.…”
Section: Improvement of Network Structure
confidence: 99%
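As a rough illustration of the kind of fusion the citing work describes, the sketch below (PyTorch) applies a CBAM-style block (channel attention followed by spatial attention) inside a CSP-style split-and-merge path. The module names, channel split, reduction ratio, and fusion order here are assumptions chosen for clarity, not the authors' actual implementation, and the GAM branch is omitted.

# Minimal sketch, assuming a CBAM-style attention block embedded in a
# CSP-style partial path; all design choices below are illustrative.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: pool over spatial dims, re-weight each channel."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling branch
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w


class SpatialAttention(nn.Module):
    """Spatial attention: pool over channels, re-weight each position."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w


class CSPAttentionBlock(nn.Module):
    """CSP-style block: split channels, run CBAM-style attention on one
    branch, pass the other through, then concatenate and fuse with 1x1 conv."""

    def __init__(self, channels: int):
        super().__init__()
        half = channels // 2
        self.attn = nn.Sequential(ChannelAttention(half), SpatialAttention())
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, b = x.chunk(2, dim=1)   # cross-stage partial split
        return self.fuse(torch.cat([self.attn(a), b], dim=1))


if __name__ == "__main__":
    block = CSPAttentionBlock(64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])

The point of the split is that attention is computed on only half the channels, which keeps the extra cost small relative to applying CBAM to the full feature map.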