2022
DOI: 10.1007/s12539-021-00496-7
Multiple Protein Subcellular Locations Prediction Based on Deep Convolutional Neural Networks with Self-Attention Mechanism

Cited by 9 publications (5 citation statements)
References 77 publications
“…As noted above, the prediction of protein subcellular localization has always been a playground where the latest machine learning algorithms are introduced. In recent years, deep learning-based methods have become quite popular and thus a number of papers have been published within a few years ( Cong et al, 2020 , 2022 ; Semwal & Varadwaj, 2020 ; Jiang, Wang, Yao, et al, 2021 ; Liao et al, 2021 ; Yuan et al, 2021 ). The architecture of deep learning models has made rapid progress and they have also been applied to bioinformatics, such as protein design ( Ding et al, 2022 ).…”
Section: Deep Learning and Language Model-based Methods
confidence: 99%
“…One of them is the use of (multi-head) self-attention mechanism, which was first introduced in Transformer (reviewed in Shreyashree et al, 2022 ). Both Jiang et al and Cong et al report the improvement of prediction performance with the use of the self-attention mechanism ( Jiang, Wang, Yao, et al, 2021 ; Cong et al, 2022 ). Jiang et al also claim that their method shows better performance in suborganellar prediction (see below).…”
Section: Deep Learning and Language Model-based Methods
confidence: 99%
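The excerpt above credits the (multi-head) self-attention mechanism, introduced with the Transformer, for the reported performance gains. As a point of reference, the core operation is scaled dot-product attention; the sketch below is a minimal single-head illustration in pure Python, simplified by assuming Q = K = V = X (real models apply learned projection matrices W_q, W_k, W_v to the input first).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of feature
    vectors X (list of lists). Simplifying assumption: Q = K = V = X,
    i.e. no learned projections. Each output position is a
    softmax-weighted average of all positions, so every residue can
    attend to every other residue in the sequence."""
    d = len(X[0])
    out = []
    for q in X:
        # similarity of this position to every position, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        weights = softmax(scores)  # attention weights, sum to 1
        # weighted sum of value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, X))
                    for j in range(d)])
    return out
```

Because the weights form a convex combination, each output vector stays inside the span of the inputs; in protein localization models, inspecting these weights is what lets authors point to residues the model "focused on".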
“…Mining deeper, Kaleel et al [53] ensemble Deep N-to-1 Convolutional Neural Networks that predict the location of the endomembrane system and secretory pathway versus all others and outperform many state-of-the-art web servers. Cong et al [54] proposed a self-evolving deep convolutional neural network (DCNN) protocol to solve the difficulties in feature correlation between sites and avoid the impact of unknown data distribution while using the self-attention mechanism [55] and a customized loss function to ensure the model performance. In addition, a long short-term memory network (LSTM) which combines the previous states and current inputs is also commonly used [56,57], with Generative Adversarial Network (GAN) [58] and Synthetic Minority Over-sampling Technique (SMOTE) [59] used for synthesizing minority samples to deal with data imbalance.…”
Section: Sequence-based AI Approaches
confidence: 99%
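The excerpt above mentions SMOTE as a remedy for class imbalance: it synthesizes minority-class samples rather than merely duplicating them. As a hedged illustration (not the full published algorithm, which samples among the k nearest neighbours), the sketch below interpolates between a random minority point and its single nearest minority neighbour:

```python
import random

def smote_samples(minority, n_new, seed=0):
    """Minimal SMOTE-style oversampling sketch. Assumption: we use only
    the 1-nearest minority neighbour instead of sampling from the k
    nearest, as the original SMOTE does. Each synthetic point lies on
    the line segment between two real minority samples."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # nearest neighbour of a within the minority class (excluding a)
        b = min((p for p in minority if p is not a),
                key=lambda p: sum((x - y) ** 2 for x, y in zip(a, p)))
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append([x + gap * (y - x) for x, y in zip(a, b)])
    return synthetic
```

Because synthetic points are convex combinations of real minority samples, they stay inside the minority region of feature space, which is why SMOTE tends to generalize better than naive duplication.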
“…An attention mechanism can be defined as a function that allows the model to focus on important information only, e.g., finding amino acids with higher attention weights, or focusing on attention weights of each function considered. Similarly, MPSLP [23] also uses a Self-Attention Mechanism with Deep Convolutional Neural Networks. This predictor integrates features through amino acid index distribution and physicochemical properties.…”
Section: Literature Review
confidence: 99%