2021
DOI: 10.1016/j.neuroimage.2020.117649
DIKA-Nets: Domain-invariant knowledge-guided attention networks for brain skull stripping of early developing macaques

Cited by 17 publications (8 citation statements)
References 43 publications
“…image noise, bias field in [21]) and no a-priori information are required (e.g. center of gravity distance map, signed distance map in [18]).…”
Section: Discussion
confidence: 99%
“…We complete our comparison with the state of the art by positioning our performance relative to methods applied to other animal models: macaques [18,13], rodents [17,23,24,13] and pigs [21]. This report of the performance of related works is for information purposes only, since the datasets differ and not all third-party codes are freely available [18,21,23].…”
Section: ASSD
confidence: 99%
“…Brain tissue extraction focuses on removing non-brain tissues, such as the skull, muscles, and eyes, while preserving the brain tissue [12,13,14]. Numerous software tools for skull stripping the human brain have been developed, such as the Brain Extraction Tool (BET) in FSL [15,16], 3dSkullStrip in AFNI [17,18,19], and the hybrid watershed algorithm (HWA) in FreeSurfer [20,21]. Although these tools perform well on the human brain, their performance is lacking on the macaque brain, mainly due to the image differences between macaque and human brains [22].…”
Section: Introduction
confidence: 99%