2023
DOI: 10.1109/jbhi.2023.3274255
Uncertainty-Aware Multi-Dimensional Mutual Learning for Brain and Brain Tumor Segmentation

Cited by 18 publications (1 citation statement)
References 31 publications
“…However, this method might lean towards learning incorrect volumetric structural information due to the resolution difference between the x, y, and z axes in MRIs (image anisotropy), overlooking important details within slices and causing distortion in segmentation results [25, 26]. The third approach utilizes hybrid 2D-3D networks, combining the capabilities of 2D networks to focus on fine details within slices with the advantages of 3D networks in understanding the overall spatial structures [26–28], effectively solving the issue of spatial feature loss encountered with the use of only the 2D networks, as well as the issue of the 3D networks in dealing with the anisotropy of the images [29].…”
Section: Introduction
confidence: 99%
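The citation statement describes hybrid 2D-3D networks that fuse an in-slice 2D branch with a volumetric 3D branch. A minimal NumPy sketch of that fusion idea is below; the mean filters are hypothetical stand-ins for learned 2D and 3D convolutional branches, and `alpha` is an assumed fusion weight, not anything specified by the cited work.

```python
import numpy as np

def slice_features_2d(vol):
    # Per-slice 3x3 mean filter: a toy stand-in for a 2D branch that
    # preserves fine in-plane detail and ignores inter-slice anisotropy.
    out = np.empty(vol.shape, dtype=float)
    for z in range(vol.shape[0]):
        s = vol[z].astype(float)
        p = np.pad(s, 1, mode="edge")
        out[z] = sum(p[i:i + s.shape[0], j:j + s.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
    return out

def volume_features_3d(vol):
    # 3x3x3 mean filter: a toy stand-in for a 3D branch that captures
    # inter-slice spatial context across the whole volume.
    p = np.pad(vol.astype(float), 1, mode="edge")
    d, h, w = vol.shape
    return sum(p[i:i + d, j:j + h, k:k + w]
               for i in range(3) for j in range(3) for k in range(3)) / 27.0

def hybrid_segment(vol, alpha=0.5, thresh=0.5):
    # Fuse the two branches by a weighted sum, then threshold to a mask.
    fused = alpha * slice_features_2d(vol) + (1 - alpha) * volume_features_3d(vol)
    return fused > thresh
```

In a real hybrid network the two branches would be trained jointly and fused by learned layers rather than a fixed weighted sum; the sketch only illustrates why combining in-plane and volumetric context can recover structure that either branch alone would miss.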