2019 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT)
DOI: 10.1109/iciaict.2019.8784845
Transfer Learning for Alzheimer's Disease Detection on MRI Images

Cited by 75 publications (28 citation statements) · References 26 publications
“…The second most popular strategy to apply transfer learning was fine-tuning certain parameters in a pretrained CNN [34, 127–146]. The remaining approaches first optimized a feature extractor (typically a CNN or an SVM) and then trained a separate model (SVMs [30, 45, 147–149], long short-term memory networks [150, 151], clustering methods [148, 152], random forests [70, 153], multilayer perceptrons [154], logistic regression [148], elastic net [155], CNNs [156]). Additionally, Yang et al. [157] ensembled CNNs and fine-tuned their individual contributions.…”
Section: Results
confidence: 99%
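
To make the first strategy concrete, below is a minimal PyTorch/torchvision sketch of fine-tuning only selected parameters of an ImageNet-pretrained CNN. The three-class CN/MCI/AD head and the choice to unfreeze only the last residual stage are illustrative assumptions, not details taken from the cited papers.

    # Fine-tuning strategy: freeze most of a pretrained CNN, train the rest.
    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)

    # Freeze everything, then unfreeze the block(s) selected for fine-tuning.
    for param in model.parameters():
        param.requires_grad = False
    for param in model.layer4.parameters():
        param.requires_grad = True

    # Replace the 1000-way ImageNet head with a task-specific head;
    # a freshly created layer is trainable by default.
    model.fc = nn.Linear(model.fc.in_features, 3)

    # Optimize only the parameters left trainable.
    optimizer = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=1e-4
    )
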
“…We consider three different popular open-source deep learning models with CNN structures for the AD classification task. These models include VGG16 [12], ResNet50 [13], and DenseNet121 [14], which were originally created for the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) [24] and were shown to perform well for AD classification [18, 21]. We briefly summarize the main characteristics of these models below.…”
Section: Prediction Models
confidence: 99%
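
A sketch of how the three cited architectures might be instantiated with ImageNet weights and given a new classification head; the binary AD-vs-control setup here is an assumption for illustration, not the cited papers' exact configuration.

    # Instantiate VGG16 / ResNet50 / DenseNet121 with pretrained weights and
    # swap the ImageNet head for a task-specific one.
    import torch.nn as nn
    from torchvision import models

    def build_model(name: str, num_classes: int = 2) -> nn.Module:
        if name == "vgg16":
            m = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
            m.classifier[6] = nn.Linear(m.classifier[6].in_features, num_classes)
        elif name == "resnet50":
            m = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
            m.fc = nn.Linear(m.fc.in_features, num_classes)
        elif name == "densenet121":
            m = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
            m.classifier = nn.Linear(m.classifier.in_features, num_classes)
        else:
            raise ValueError(f"unknown architecture: {name}")
        return m
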
“…In comparison, a VGG16 model trained from scratch achieved only 74.12% accuracy. Transfer learning on MRI scans from the ADNI dataset using architectures such as VGGNet and ResNet [13] has also been shown to outperform a 3D-CNN model trained from scratch [21].…”
Section: Introduction
confidence: 96%
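
In code, the two setups being compared differ only in their initial weights; a torchvision sketch (training loop and data are elided, and the choice of VGG16 merely mirrors the excerpt above):

    from torchvision import models

    scratch = models.vgg16(weights=None)  # random init, trained from scratch
    pretrained = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)  # transfer
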
“…The second most popular strategy to apply transfer learning was fine-tuning certain parameters in a pretrained CNN [32, 125–144]. The remaining approaches first optimized a feature extractor (typically a CNN or an SVM) and then trained a separate model (SVMs [28, 43, 145–147], long short-term memory networks [148, 149], clustering methods [146, 150], random forests [68, 151], multilayer perceptrons [152], logistic regression [146], elastic net [153], CNNs [154]). Additionally, Yang et al. [155] ensembled CNNs and fine-tuned their individual contributions.…”
Section: Parameter-based Approaches
confidence: 99%
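
The second family of approaches in this excerpt can be sketched as a frozen pretrained CNN feeding a separately trained classifier, here an SVM. This assumes torchvision and scikit-learn; the preprocessing of MRI slices into (N, 3, 224, 224) tensors is an assumption and is elided.

    # Feature-extractor strategy: frozen CNN features + a separate SVM.
    import torch
    from torchvision import models
    from sklearn.svm import SVC

    backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    backbone.fc = torch.nn.Identity()  # expose the 2048-d pooled features
    backbone.eval()

    @torch.no_grad()
    def extract_features(images: torch.Tensor) -> torch.Tensor:
        # images: (N, 3, 224, 224) preprocessed MRI slices (assumed format)
        return backbone(images)

    # Hypothetical usage with tensors X_train and labels y_train:
    # svm = SVC(kernel="rbf").fit(extract_features(X_train).numpy(), y_train)
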