2022
DOI: 10.3389/fnagi.2022.930584

An Attention-Based CoT-ResNet With Channel Shuffle Mechanism for Classification of Alzheimer’s Disease Levels

Abstract: Detection of early morphological changes in the brain and early diagnosis are important for Alzheimer’s disease (AD), and high-resolution magnetic resonance imaging (MRI) can be used to help diagnose and predict the disease. In this paper, we proposed two improved ResNet algorithms that introduced the Contextual Transformer (CoT) module, group convolution, and Channel Shuffle mechanism into the traditional ResNet residual blocks. The CoT module is used to replace the 3 × 3 convolution in the residual block to …

Cited by 12 publications (5 citation statements). References 35 publications.
“…However, their implementation has raised ethical and cultural concerns (119). Initially introduced by Vaswani et al. (120) for language translation, transformers are now widely used in text generation to predict the next word in a sentence. The transformer architecture's main innovation is the "attention mechanism," which enables it to associate words that refer to the same concept.…”
Section: Transformers
confidence: 99%
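The attention mechanism referred to in this statement is, in the form introduced by Vaswani et al., scaled dot-product attention: each query is compared against all keys, and a softmax over the similarity scores weights a mix of the values. A minimal NumPy sketch of that standard formulation (not the cited paper's implementation; the array shapes are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Each of the nq queries attends to all nk keys; the softmax-normalized
    scores weight a linear combination of the value vectors."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (nq, nk) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # each row sums to 1
    return weights @ v                             # (nq, d) weighted mix of values

# Illustrative shapes: 4 tokens, embedding dimension 8.
q = np.random.randn(4, 8)
k = np.random.randn(4, 8)
v = np.random.randn(4, 8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

Because the weights form a probability distribution over the keys, tokens that score highly against a query dominate its output vector, which is how attention "associates" related words.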
“…Based on the present results, we speculate that our framework could serve not only to enhance the interpretability of a black-box model which achieves state-of-the-art classification performances, thus addressing the problem of the trade-off between accuracy and trustworthiness [65], but also as a guide for experts to facilitate the extraction of rsEEG markers of cognitive decay. A recent work employed the attention mechanism to design an EEG channel interpolation algorithm [66]. Similarly, our method could be exploited also in different applications to select relevant domain-specific information by taking into account short and long temporal dependencies of the signal.…”
Section: General Remarks
confidence: 99%
“…The cross‐channel fusion attention module first groups the dimensions of channels into sub‐features, and for each sub‐feature, a Shuffle unit is used to describe the spatial and channel correlation of the features. After that, all the sub‐features are aggregated and the “Channel Shuffle” [32] operator is used to communicate information between the different sub‐features. The multi‐branch structure allows the model to become deeper and easier to train.…”
Section: The Proposed Theory
confidence: 99%
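The "Channel Shuffle" operator this statement cites has a standard implementation: reshape the channel axis into (groups, channels-per-group), swap those two axes, and flatten back, so that channels from different grouped-convolution branches become interleaved. A minimal NumPy sketch (NCHW layout assumed; this illustrates the operator generically, not the cited model's exact module):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Interleave channels across groups so information can flow between
    the sub-features produced by grouped convolutions.
    x: array of shape (N, C, H, W); C must be divisible by `groups`."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)  # split channels into groups
    x = x.transpose(0, 2, 1, 3, 4)               # swap group and per-group axes
    return x.reshape(n, c, h, w)                 # flatten back to (N, C, H, W)

# Example: 6 channels in 3 groups; channel order 0..5 becomes 0,2,4,1,3,5,
# so each output group now holds one channel from every input group.
x = np.arange(6).reshape(1, 6, 1, 1)
print(channel_shuffle(x, 3).ravel().tolist())  # -> [0, 2, 4, 1, 3, 5]
```

Without this shuffle, stacked grouped convolutions would keep each group's channels isolated from the others; the transpose is what lets the multi-branch structure mix information while staying cheap.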