2018
DOI: 10.3390/app8081372
Feature Selection and Transfer Learning for Alzheimer’s Disease Clinical Diagnosis

Abstract: Background and Purpose: A majority of studies on the diagnosis of Alzheimer's Disease (AD) rest on the assumption that the training and testing data are drawn from the same distribution. However, in the diagnosis of AD and mild cognitive impairment (MCI), this identical-distribution assumption may not hold. To address this problem, we apply transfer learning to the diagnosis of AD. Methods: The MR (Magnetic Resonance) images were segmented using the SPM Dartel toolbox and registered with the Automatic Anatomical…
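
As a rough illustration of the pipeline sketched in the abstract, the snippet below combines simple feature selection with an instance-weighted SVM trained on pooled source- and target-domain subjects. The arrays, the ANOVA-based selector, and the distance-based weighting are placeholders for illustration only, not necessarily the methods used in the paper.

    # Rough sketch (not the paper's exact method): feature selection plus an
    # instance-weighted SVM trained on pooled source- and target-domain data.
    # X_src/y_src and X_tgt/y_tgt are synthetic stand-ins for regional
    # gray-matter features; the weighting scheme is a simple illustrative proxy.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_src, y_src = rng.normal(size=(120, 90)), rng.integers(0, 2, 120)
    X_tgt, y_tgt = rng.normal(size=(40, 90)), rng.integers(0, 2, 40)

    # Weight each source subject by its closeness to the target-domain mean.
    dists = np.linalg.norm(X_src - X_tgt.mean(axis=0), axis=1)
    w_src = np.exp(-dists / dists.mean())

    # A few labeled target subjects join the training set; the rest are held out.
    X_tr, y_tr, X_te, y_te = X_tgt[:20], y_tgt[:20], X_tgt[20:], y_tgt[20:]

    clf = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=30), SVC(kernel="linear"))
    clf.fit(np.vstack([X_src, X_tr]), np.concatenate([y_src, y_tr]),
            svc__sample_weight=np.concatenate([w_src, np.ones(len(X_tr))]))
    print("held-out target accuracy:", clf.score(X_te, y_te))
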

Cited by 43 publications (16 citation statements)
References 28 publications
“…Images’ weights were derived by minimizing the distance (Kullback-Leibler divergence, Bhattacharyya, squared Euclidean, and maximum mean discrepancy) between the PDFs in the source and target domains [ 47 , 48 , 49 , 50 ]. On the other hand, supervised strategies optimized images’ weights and the ML model simultaneously by minimizing the corresponding task-specific loss [ 45 , 51 , 52 , 53 , 54 ]. Notably, supervised and unsupervised strategies can be combined, and approaches can also incorporate extra information unused in the main task.…”
Section: Results (mentioning)
confidence: 99%
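
As a small illustration of one of the distribution distances named in this excerpt, the sketch below computes the squared maximum mean discrepancy (MMD) between source- and target-domain feature matrices with an RBF kernel; X_src, X_tgt, and gamma are hypothetical stand-ins.

    # Squared maximum mean discrepancy (MMD) with an RBF kernel, one of the
    # distribution distances listed above; X_src and X_tgt are hypothetical
    # (n_subjects, n_features) arrays and gamma is an illustrative bandwidth.
    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # k(a, b) = exp(-gamma * ||a - b||^2), computed pairwise.
        sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
        return np.exp(-gamma * sq)

    def mmd2(X_src, X_tgt, gamma=1.0):
        # MMD^2 = E[k(s, s')] + E[k(t, t')] - 2 E[k(s, t)].
        return (rbf_kernel(X_src, X_src, gamma).mean()
                + rbf_kernel(X_tgt, X_tgt, gamma).mean()
                - 2.0 * rbf_kernel(X_src, X_tgt, gamma).mean())

    rng = np.random.default_rng(0)
    print(mmd2(rng.normal(0.0, 1.0, (100, 20)), rng.normal(0.5, 1.0, (80, 20))))
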
“…Several studies in the machine learning area have highlighted the importance of feature selection for the improvement of classification performance [ 20 , 21 ]. For example, an effective feature selection method based on computing the chi-square statistical value was introduced in [ 22 ].…”
Section: Background of the Study (mentioning)
confidence: 99%
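
A minimal sketch of chi-square feature selection of the kind described in the work cited as [22], on synthetic data; the feature matrix, the labels, and the number of retained features are placeholders.

    # Chi-square feature ranking in the spirit of the method cited as [22];
    # the data, the number of retained features, and the scaling step are
    # placeholders (chi2 requires non-negative inputs, hence the min-max scaling).
    import numpy as np
    from sklearn.feature_selection import SelectKBest, chi2
    from sklearn.preprocessing import MinMaxScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(150, 50))            # stand-in feature matrix
    y = rng.integers(0, 2, 150)               # stand-in labels (e.g. AD vs. control)

    X_nonneg = MinMaxScaler().fit_transform(X)
    selector = SelectKBest(chi2, k=10).fit(X_nonneg, y)
    print("retained feature indices:", selector.get_support(indices=True))
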
“…The capability and usefulness of the SVM classification method to discriminate AD patients from healthy ones was demonstrated in other studies [27,28,68]. However, it must be stressed that these works made use of structural brain features only, i.e., morphological information derived from MRI [68]. Even when a reduced number of features is employed, this method should be regarded as a partial and limited approach to the development of a reliable diagnostic tool.…”
Section: Support Vector Machine Classifier (mentioning)
confidence: 99%
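
For orientation, a minimal SVM baseline on a synthetic stand-in for the structural (morphological) MRI features discussed above; the linear kernel, C, and the 5-fold evaluation are illustrative choices, not the exact setup of the cited studies.

    # Minimal SVM baseline on a synthetic stand-in for structural MRI features;
    # kernel, C, and 5-fold cross-validation are illustrative choices, not the
    # exact setup of the cited studies.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 90))            # e.g. regional morphological measures
    y = rng.integers(0, 2, 120)               # 0 = healthy control, 1 = AD (placeholder)

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
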