Transformers in medical imaging: A survey (2023)
DOI: 10.1016/j.media.2023.102802

Cited by 381 publications (97 citation statements)
References 167 publications
“…Swin UNETR computes self-attention via an efficient shifted-window partitioning algorithm and ranks first on the BraTS 2021 [ 38 ] validation set [ 48 ]. This approach, which is commonly used in medical imaging applications, is built on top of a Swin Transformer to extract and down-sample feature maps before feeding them into a Transformer [ 49 ].…”
Section: Segmentation (mentioning)
confidence: 99%
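The shifted-window attention mentioned in the statement above is what lets Swin-style backbones such as Swin UNETR keep self-attention cost manageable on large images and volumes: attention is computed inside local windows, and alternating layers cyclically shift the windows so information crosses the previous window borders. The following is a minimal 2D PyTorch sketch of that partitioning step; the window size, shift, and tensor shapes are illustrative assumptions (Swin UNETR itself works on 3D patches), not the exact published implementation.

```python
import torch

def window_partition(x: torch.Tensor, window_size: int) -> torch.Tensor:
    """Split a (B, H, W, C) feature map into non-overlapping windows of
    shape (num_windows * B, window_size * window_size, C) for local attention."""
    B, H, W, C = x.shape
    x = x.view(B, H // window_size, window_size, W // window_size, window_size, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, window_size * window_size, C)

def shifted_window_partition(x: torch.Tensor, window_size: int, shift: int) -> torch.Tensor:
    """Cyclically shift the feature map before partitioning, so the next
    attention layer mixes tokens across the previous window boundaries."""
    shifted = torch.roll(x, shifts=(-shift, -shift), dims=(1, 2))
    return window_partition(shifted, window_size)

# Toy usage: a 56x56 feature map with 7x7 windows, shifted by 3 pixels.
feat = torch.randn(1, 56, 56, 96)               # (B, H, W, C)
regular = window_partition(feat, 7)              # (64, 49, 96): attention within windows
shifted = shifted_window_partition(feat, 7, 3)   # same shape, windows straddle old borders
print(regular.shape, shifted.shape)
```

In a full Swin block, window attention on `regular` alternates with window attention on `shifted` (plus a mask for wrapped-around positions), which is how global context is built up without ever paying quadratic attention over the whole feature map.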
“…Transformer and its variants have been demonstrated to be effective in various vision tasks [22, 30–32]. For example, TNT [22] employs it for fine‐grained image tasks, where external and internal Transformers are used to extract global and local features.…”
Section: Related Work (mentioning)
confidence: 99%
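For readers unfamiliar with the two-level design referenced above, the sketch below pairs an inner Transformer over sub-patch ("pixel") tokens, which captures local features, with an outer Transformer over patch tokens, which captures global features. The dimensions, fusion layer, and overall block structure are simplifying assumptions rather than the published TNT configuration.

```python
import torch
import torch.nn as nn

class TNTBlockSketch(nn.Module):
    """One block of a TNT-style model: inner attention over sub-patches inside
    each patch (local), outer attention over patch embeddings (global)."""

    def __init__(self, inner_dim: int = 24, outer_dim: int = 192, sub_patches: int = 16):
        super().__init__()
        self.inner = nn.TransformerEncoderLayer(d_model=inner_dim, nhead=4, batch_first=True)
        self.outer = nn.TransformerEncoderLayer(d_model=outer_dim, nhead=4, batch_first=True)
        # Project each patch's flattened inner tokens into the outer embedding space.
        self.fuse = nn.Linear(inner_dim * sub_patches, outer_dim)

    def forward(self, pixel_tokens: torch.Tensor, patch_tokens: torch.Tensor):
        # pixel_tokens: (B * num_patches, sub_patches, inner_dim) -> local features
        # patch_tokens: (B, num_patches, outer_dim)               -> global features
        B, N, _ = patch_tokens.shape
        pixel_tokens = self.inner(pixel_tokens)
        local = self.fuse(pixel_tokens.reshape(B, N, -1))
        patch_tokens = self.outer(patch_tokens + local)
        return pixel_tokens, patch_tokens

# Toy usage: batch of 2 images, 4 patches each, 16 sub-patches per patch.
blk = TNTBlockSketch()
pixels = torch.randn(2 * 4, 16, 24)
patches = torch.randn(2, 4, 192)
pixels, patches = blk(pixels, patches)
print(pixels.shape, patches.shape)  # (8, 16, 24) (2, 4, 192)
```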
“…This has allowed for a more comprehensive and effective diagnosis of diseases, thereby improving human health care. Understanding the overall context between pixels and the background can help models avoid mis-segmentation [18].…”
Section: Applications of MIS (mentioning)
confidence: 99%