2020
DOI: 10.3389/fnins.2019.01456

QC-Automator: Deep Learning-Based Automated Quality Control for Diffusion MR Images

Abstract: Quality assessment of diffusion MRI (dMRI) data is essential prior to any analysis, so that appropriate pre-processing can be used to improve data quality and ensure that the presence of MRI artifacts does not affect the results of subsequent image analysis. Manual quality assessment of the data is subjective, possibly error-prone, and infeasible, especially considering the growing number of consortium-like studies, underlining the need for automation of the process. In this paper, we have developed a deep-learn…

Cited by 26 publications (18 citation statements); references 38 publications.
“…The second most popular strategy to apply transfer learning was fine-tuning certain parameters in a pretrained CNN [34, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146]. The remaining approaches first optimized a feature extractor (typically a CNN or an SVM) and then trained a separate model (SVMs [30, 45, 147, 148, 149], long short-term memory networks [150, 151], clustering methods [148, 152], random forests [70, 153], multilayer perceptrons [154], logistic regression [148], elastic net [155], CNNs [156]).…”
Section: Results
confidence: 99%
“…Fine-tuning all parameters (prior-sharing) and fine-tuning certain parameters (parameter-sharing) were widely used methods, although in the latter case we rarely found justifications for the choice of which parameters to fine-tune. Since the first layers of CNNs capture low-level information, such as borders and corners, various studies [34, 134, 135, 137, 145] have considered that those parameters can be shared across domains. Besides, as adapting pretrained CNNs to the target domain requires, at the very least, replacing the last layer of these models, researchers have likely turned fine-tuning only this randomly initialized layer into common practice, although we found no empirical studies that supported such practice.…”
Section: Discussion
confidence: 99%
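The last-layer practice described in this statement can be sketched in plain Python. This is a minimal illustration, not the cited papers' method: the "pretrained" extractor is a stand-in (a fixed random projection with ReLU rather than a real CNN), and all names and dimensions are invented for the example. Only the new, randomly initialized last layer is trained; the shared early-layer parameters stay frozen.

```python
import math
import random

random.seed(0)
DIM_IN, DIM_FEAT = 8, 4

# Frozen "pretrained" extractor weights: shared across domains, never updated.
W_frozen = [[random.uniform(-1, 1) for _ in range(DIM_IN)]
            for _ in range(DIM_FEAT)]

def extract(x):
    # ReLU(W_frozen @ x), standing in for the early layers of a CNN.
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in W_frozen]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy binary task: label 1 when the first input coordinate is positive.
xs = [[random.uniform(-1, 1) for _ in range(DIM_IN)] for _ in range(200)]
ys = [1.0 if x[0] > 0 else 0.0 for x in xs]

# Replaced last layer, randomly initialized: the only trainable parameters.
w = [random.uniform(-0.1, 0.1) for _ in range(DIM_FEAT)]
b = 0.0

def mean_loss():
    total = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(sum(wi * fi for wi, fi in zip(w, extract(x))) + b)
        total -= y * math.log(p + 1e-12) + (1 - y) * math.log(1 - p + 1e-12)
    return total / len(xs)

loss_before = mean_loss()
for _ in range(100):              # plain SGD, updating only w and b
    for x, y in zip(xs, ys):
        f = extract(x)
        g = sigmoid(sum(wi * fi for wi, fi in zip(w, f)) + b) - y
        w = [wi - 0.05 * g * fi for wi, fi in zip(w, f)]
        b -= 0.05 * g
loss_after = mean_loss()
print(loss_after < loss_before)   # the new last layer fit the target task
```

Because the loss is convex in the last layer's parameters once the extractor is fixed, training only that layer is cheap and stable, which may explain why the practice became common even without empirical justification.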
“…Each classifier must detect one sub-concept of the current node from the others. One could use a stronger classification approach to have a better summarization method [49].…”
Section: Hierarchical Classification
confidence: 99%
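The local-classifier-per-node scheme this statement refers to can be sketched as follows. Everything here is a hypothetical illustration: the label tree, the per-sub-concept scoring rules, and the feature names are invented stand-ins for trained classifiers, one per child of each node, with the best-scoring child followed at every level.

```python
# Hypothetical label hierarchy: each node lists its sub-concepts.
TREE = {
    "root": ["animal", "vehicle"],
    "animal": ["cat", "dog"],
    "vehicle": ["car", "bike"],
}

# One scoring function per sub-concept (stand-ins for trained classifiers
# that detect that sub-concept of the current node against the others).
SCORERS = {
    "animal":  lambda x: x["legs"],
    "vehicle": lambda x: x["wheels"],
    "cat":     lambda x: x["whiskers"],
    "dog":     lambda x: 1 - x["whiskers"],
    "car":     lambda x: x["wheels"] / 4,
    "bike":    lambda x: x["wheels"] / 2,
}

def classify(x, node="root"):
    """Descend the hierarchy, picking the best-scoring child at each node."""
    children = TREE.get(node)
    if not children:
        return node                      # reached a leaf concept
    best = max(children, key=lambda c: SCORERS[c](x))
    return classify(x, best)

print(classify({"legs": 4, "wheels": 0, "whiskers": 1}))   # → cat
print(classify({"legs": 0, "wheels": 2, "whiskers": 0}))   # → bike
```

Swapping each lambda for a stronger binary classifier, as the quoted passage suggests, changes only the `SCORERS` table; the routing logic stays the same.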
“…Deep learning tools, especially convolutional neural networks (CNNs), are particularly effective in elucidating local patterns [20, 21]. They learn the underlying features of importance as well as the discrimination algorithm.…”
Section: Introduction
confidence: 99%
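The sense in which convolutional filters respond to local patterns can be shown with a minimal sketch: a hand-set 1-D finite-difference kernel (an assumption for illustration, not a learned CNN filter) fires exactly where the signal has a local jump.

```python
# A 1-D "edge detector" kernel slid over a step signal.
signal = [0, 0, 0, 1, 1, 1]
kernel = [-1, 1]            # finite difference: responds to local changes

response = [
    sum(k * signal[i + j] for j, k in enumerate(kernel))
    for i in range(len(signal) - len(kernel) + 1)
]
print(response)  # → [0, 0, 1, 0, 0]: peaks only at the 0 -> 1 transition
```

In a trained CNN the kernel values are learned from data rather than hand-set, so the network discovers which local patterns matter for the task along with the discrimination rule itself.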