2023
DOI: 10.1021/acs.jcim.3c01017
AMPpred-MFA: An Interpretable Antimicrobial Peptide Predictor with a Stacking Architecture, Multiple Features, and Multihead Attention

Changjiang Li,
Quan Zou,
Cangzhi Jia
et al.

Abstract: Antimicrobial peptides (AMPs) are small molecular polypeptides that can be widely used in the prevention and treatment of microbial infections. Although many computational models have been proposed to help identify AMPs, a high-performance and interpretable model is still lacking. In this study, new benchmark data sets are collected and processed, and a stacking deep architecture named AMPpred-MFA is carefully designed to discover and identify AMPs. Multiple features and a multihead attention mechanism are uti…

Citations: cited by 4 publications (4 citation statements)
References: 48 publications (66 reference statements)
“…Utilizing the XUAMP dataset proposed by Xu et al (2021) we compared deepAMPNet with 11 state-of-the-art AMPs predictors including: amPEPpy ( Lawrence et al, 2021 ), AMPfun ( Chung et al, 2019 ), AMPEP ( Bhadra et al, 2018 ), ADAM-HMM ( Lee et al, 2015 ), ampir ( Fingerhut et al, 2021 ), AMPScannerV2 ( Veltri, Kamath & Shehu, 2018 ), AMPGram ( Burdukiewicz et al, 2020 ), Deep-AMPEP30 ( Yan et al, 2020 ), CAMP-ANN ( Waghu et al, 2016 ), sAMPpred-GAT ( Yan et al, 2023 ) and AMPpred-MFA ( Li et al, 2023 ). To avoid overestimating the performance of deepAMPNet, sequences with similarity above 90% with the XUAMP dataset were removed from the deepAMPNet training dataset.…”
Section: Results
confidence: 99%
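The statement above describes a common leakage-control step: dropping training sequences that share more than 90% similarity with the test set. A minimal sketch of that filtering logic is shown below. It is not the cited study's pipeline (which would typically use a dedicated tool such as CD-HIT); the sequences and the `difflib`-based similarity proxy are purely illustrative assumptions.

```python
# Illustrative sketch only: filter out training sequences too similar
# (>= 90%) to any test-set sequence, to avoid overestimating performance.
# Real studies typically use CD-HIT; difflib's ratio is a crude proxy
# used here just to demonstrate the filtering logic.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude sequence-similarity proxy in [0, 1] (2*matches / total length)."""
    return SequenceMatcher(None, a, b).ratio()

def filter_training_set(train, test, threshold=0.90):
    """Keep only training sequences below the similarity threshold
    against every test sequence."""
    return [s for s in train
            if all(similarity(s, t) < threshold for t in test)]

# Hypothetical peptide sequences (not from the cited datasets):
train = ["GIGKFLHSAKKF", "GIGKFLHSAKKW", "ACDEFGHIKLMN"]
test = ["GIGKFLHSAKKF"]
print(filter_training_set(train, test))
# The identical sequence and the one-mismatch variant (~92% similar)
# are dropped; only the unrelated sequence survives.
```

The one-sided direction of the filter matters: sequences are removed from the *training* set, leaving the independent test set untouched.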
“…We comprehensively evaluated the performance of deepAMPNet on the MFA_test dataset with other 11 state-of-the-art AMPs predictors including: ampir ( Fingerhut et al, 2021 ), CAMP3 (RF), CAMP3 (SVM), CAMP3 (DA), CAMP3 (ANN) ( Waghu et al, 2016 ), iAMPpred ( Meher et al, 2017 ), AMPScannerV2 ( Veltri, Kamath & Shehu, 2018 ), AI4AMP ( Lin et al, 2021 ), AMPlify ( Li et al, 2022 ), AMPfun ( Chung et al, 2019 ) and AMPpred-MFA ( Li et al, 2023 ). We present ROC curves of all predictors in Fig.…”
Section: Results
confidence: 99%
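The comparison above ranks predictors by their ROC curves. As a self-contained sketch of how such a ROC curve and its AUC are computed from a predictor's scores, consider the following; the labels and scores are hypothetical stand-ins, not data from the cited evaluation, and the sketch assumes no tied scores (proper implementations group ties).

```python
# Minimal ROC/AUC computation from scratch (illustrative only).
def roc_points(labels, scores):
    """Return (FPR, TPR) points by sweeping the decision threshold
    from the highest score downward. Assumes no tied scores."""
    pairs = sorted(zip(scores, labels), reverse=True)
    P = sum(labels)              # number of positives
    N = len(labels) - P          # number of negatives
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _score, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / N, tp / P))
    return pts

def auc(pts):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

labels = [1, 1, 0, 1, 0, 0]                 # hypothetical ground truth
scores = [0.9, 0.8, 0.7, 0.6, 0.3, 0.2]     # hypothetical predictor scores
print(round(auc(roc_points(labels, scores)), 3))  # → 0.889
```

Comparing predictors then amounts to overlaying their `roc_points` curves and reporting the AUC of each, which is what the cited figure does.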
“…(a) protein structure prediction and functional annotation. A variety of intelligent frameworks were introduced for predicting the structures of blood-brain barrier-penetrating peptides, antimicrobial peptides, and intrinsically disordered proteins. Elia Venanzi et al applied a machine learning method that integrated protein structure, sequence, and dynamics to predict the enzymatic activity of bovine intestinal kinase variants. Zachary Smith et al identified druggable binding sites of protein targets using graph neural networks with attention.…”
confidence: 99%