2023
DOI: 10.3390/diagnostics13091658
SAA-UNet: Spatial Attention and Attention Gate UNet for COVID-19 Pneumonia Segmentation from Computed Tomography

Abstract: The disaster of the COVID-19 pandemic has claimed numerous lives and wreaked havoc on the entire world due to its transmissible nature. One of the complications of COVID-19 is pneumonia. Different radiography methods, particularly computed tomography (CT), have shown outstanding performance in effectively diagnosing pneumonia. In this paper, we propose a spatial attention and attention gate UNet model (SAA-UNet) inspired by spatial attention UNet (SA-UNet) and attention UNet (Att-UNet) to deal with the problem…

Cited by 3 publications (1 citation statement)
References 56 publications
“…It can be seen that the use of machine learning algorithms can enable the analysis of vast amounts of data collected from sensors, providing valuable insights into the physiological well-being of workers in confined spaces [21]. By training models to recognize patterns indicative of abnormal vital signs, early warning systems can be developed, alerting both the workers themselves and supervisors to the need for immediate intervention [22, 23].…”
Section: Introduction
Confidence: 99%