2022
DOI: 10.3389/fncom.2022.1000435

Transfer learning-based modified inception model for the diagnosis of Alzheimer's disease

Abstract: Alzheimer's disease (AD) is a neurodegenerative ailment that gradually deteriorates memory and weakens cognitive functions and capacities, such as recall and logic. To diagnose this disease, CT, MRI, PET, etc. are used. However, these methods are time-consuming and sometimes yield inaccurate results. Thus, deep learning models are utilized, which are less time-consuming, yield results with better accuracy, and can be used with ease. This article proposes a transfer learning-based modified Inception model…
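As a rough illustration of the approach the abstract describes, the sketch below loads a pretrained Inception backbone and swaps its classifier head for a dementia-stage classifier. It is a minimal, assumption-laden example (PyTorch/torchvision, a four-class head, frozen backbone), not the authors' actual modification.

```python
# Minimal sketch of transfer learning with a pretrained Inception backbone.
# The paper's exact modifications are not shown in the truncated abstract;
# the 4-class head and the frozen backbone are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models

def build_modified_inception(num_classes: int = 4) -> nn.Module:
    # Load InceptionV3 with ImageNet weights as the transfer-learning backbone.
    model = models.inception_v3(weights=models.Inception_V3_Weights.IMAGENET1K_V1)

    # Freeze the pretrained convolutional layers so only the new head is trained.
    for param in model.parameters():
        param.requires_grad = False

    # Replace the final fully connected layer with a new classifier head.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    # InceptionV3 also has an auxiliary classifier used during training.
    model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, num_classes)
    return model

model = build_modified_inception()
model.eval()
dummy = torch.randn(2, 3, 299, 299)   # InceptionV3 expects 299x299 RGB inputs
with torch.no_grad():
    print(model(dummy).shape)          # -> torch.Size([2, 4])
```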

Cited by 12 publications (4 citation statements)
References 30 publications
“…Their work obtained 94.93%, 94.94%, 98.3%, and 94.92% precision, recall, specificity, and accuracy, respectively. Without any details, the authors stated that the work could not guarantee reproducibility [15].…”
Section: Labeling Dementia Stages
confidence: 99%
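For readers unfamiliar with the metrics reported in the statement above, the short example below shows how precision, recall, specificity, and accuracy are derived from confusion-matrix counts. The counts are invented for illustration and do not reproduce the cited figures.

```python
# Hedged worked example: standard binary-classification metrics from
# true/false positive and negative counts. The counts are made up.
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    precision   = tp / (tp + fp)            # fraction of positive calls that are correct
    recall      = tp / (tp + fn)            # sensitivity: fraction of true positives found
    specificity = tn / (tn + fp)            # fraction of true negatives correctly rejected
    accuracy    = (tp + tn) / (tp + fp + tn + fn)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy}

print(classification_metrics(tp=90, fp=5, tn=95, fn=10))
```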
“…CNNs have also been extended with additional layers to make the model deeper to classify pulmonary nodules [21]. A shallow version of the Inception model was used to diagnose Alzheimer's disease [22]. Deeper and wider CNNs have been evaluated for a variety of computer-aided detection algorithms with transfer learning [23].…”
Section: Transfer Learning
confidence: 99%
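To make the cited "shallow Inception" idea concrete, here is a simplified Inception-style block in which parallel convolution branches are concatenated along the channel dimension. The branch and channel sizes are arbitrary assumptions, not the configuration used in the cited papers.

```python
# Simplified Inception-style block: parallel 1x1, 3x3, 5x5, and pooling
# branches whose outputs are concatenated. Channel sizes are illustrative only.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, 16, kernel_size=1)
        self.b2 = nn.Sequential(nn.Conv2d(in_ch, 16, kernel_size=1),
                                nn.Conv2d(16, 24, kernel_size=3, padding=1))
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, 16, kernel_size=1),
                                nn.Conv2d(16, 24, kernel_size=5, padding=2))
        self.b4 = nn.Sequential(nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
                                nn.Conv2d(in_ch, 16, kernel_size=1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate branch outputs along the channel dimension.
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

block = InceptionBlock(in_ch=32)
print(block(torch.randn(1, 32, 64, 64)).shape)   # -> torch.Size([1, 80, 64, 64])
```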
“…The skip connection links straight to the output after skipping a few layers. The benefit of including this type of skip connection is that regularization will skip any layer that degrades the architecture's performance [65]. As a result, a very deep neural network can be trained without the issues caused by vanishing/exploding gradients. ResNet152 [52] is a 152-layer CNN, and its modified architecture is displayed in Figure 4.…”
Section: Architecture of ResNet152
confidence: 99%
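The skip connection described above can be sketched as a minimal residual block: the input is added back to the output of a small stack of convolutions, so gradients can bypass those layers. This is an illustrative block, not the exact bottleneck unit used in ResNet152.

```python
# Minimal residual (skip-connection) block of the kind deep ResNets stack.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1   = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2   = nn.BatchNorm2d(channels)
        self.relu  = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x                          # the skip connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Adding the identity lets gradients bypass the convolutions, which is
        # what mitigates vanishing/exploding gradients in very deep networks.
        return self.relu(out + identity)

block = ResidualBlock(channels=64)
print(block(torch.randn(1, 64, 56, 56)).shape)   # -> torch.Size([1, 64, 56, 56])
```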