2021
DOI: 10.1021/acs.jproteome.1c00410

A Multitask Deep-Learning Method for Predicting Membrane Associations and Secondary Structures of Proteins

Abstract: Prediction of residue-level structural attributes and protein-level structural classes helps model protein tertiary structures and understand protein functions. Existing methods are either specialized on only one class of proteins or developed to predict only a specific type of residue-level attribute. In this work, we develop a new deep-learning method, named Membrane Association and Secondary Structure Predictor (MASSP), for accurately predicting both residue-level structural attributes (secondary structure, …

Cited by 8 publications (7 citation statements)
References 62 publications (108 reference statements)
“…Rather than focusing on single property prediction, a few studies have sought to predict a number of properties in combination, such as solvent accessibility, secondary structures, and torsion angles. These methods include AllesTM [242], MASSP [243], and TopProperty [244], which all use deep learning methods to keep abreast of any possible advances in prediction performance (Table 6). For example, in the AllesTM work, the ensemble of conventional machine learning methods (random forest) and deep learning methods (CNNs and bidirectional LSTM NNs) leads to superior performance in predicting Z-coordinates, flexibility, and topology, and its performance in predicting torsion angles, secondary structures, and monomer relative solvent accessibility is roughly similar to that of SPOT-1D.…”
Section: Prediction of Multiple Properties with Metamethods
Mentioning, confidence: 99%
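The ensemble strategy described in the statement above (a random forest combined with CNN and bidirectional LSTM networks for per-residue properties) can be illustrated with a minimal sketch. The sketch below uses PyTorch and scikit-learn; the feature dimension, class count, network sizes, and simple probability averaging are illustrative assumptions, not the published AllesTM architecture.

```python
# Minimal sketch of an ensemble that averages per-residue predictions from a
# random forest and a CNN + bidirectional LSTM network (illustrative only;
# not the published AllesTM architecture).
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

N_FEATS = 20      # assumed per-residue input features (e.g., one PSSM column)
N_CLASSES = 3     # assumed output classes (e.g., secondary-structure states)

class CnnBiLstm(nn.Module):
    def __init__(self, n_feats=N_FEATS, n_classes=N_CLASSES, hidden=64):
        super().__init__()
        # 1D convolution over the sequence captures local context.
        self.conv = nn.Conv1d(n_feats, hidden, kernel_size=7, padding=3)
        # Bidirectional LSTM captures longer-range dependencies.
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                      # x: (batch, length, n_feats)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.lstm(h)                    # (batch, length, 2 * hidden)
        return self.head(h)                    # per-residue class logits

def ensemble_predict(rf: RandomForestClassifier, net: CnnBiLstm, x: np.ndarray):
    """Average class probabilities from both models for one sequence.

    x has shape (length, N_FEATS); the random forest scores residues
    independently, while the network scores the whole sequence at once.
    Assumes rf was fitted on the same N_CLASSES labels.
    """
    p_rf = rf.predict_proba(x)                              # (length, N_CLASSES)
    with torch.no_grad():
        logits = net(torch.from_numpy(x).float().unsqueeze(0))[0]
        p_net = torch.softmax(logits, dim=-1).numpy()       # (length, N_CLASSES)
    return (p_rf + p_net) / 2.0
```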
“…Proteins from the GenBank database [80] were used as training data to build the PSSP model developed by Xavier and Thirunavukarasu [81]. Li et al. [82] predicted the secondary structure of transmembrane proteins and took the protein sequences from the OPM database [83].…”
Section: Data
Mentioning, confidence: 99%
“…CNN-2D is used in the PSSP model architecture to better extract temporal and spatial features of the input sequences. Feature vectors (PSSM and one-hot encoding) of a fixed-length residue window were employed in [102], [103], [104], [82] as input to the two-dimensional CNN.…”
Section: PSSP in Pre-AlphaFold Publications
Mentioning, confidence: 99%
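The window-based input described above (PSSM plus one-hot encoding of a fixed-length residue window fed to a 2D CNN) can be sketched as follows. The window length, channel layout, and layer sizes are illustrative assumptions rather than the exact configurations used in the cited models.

```python
# Minimal sketch of a fixed-length residue window encoded as one-hot + PSSM
# features and classified by a 2D CNN (dimensions and layers are assumptions).
import torch
import torch.nn as nn

WINDOW = 19        # assumed window length centred on the target residue
N_AA = 20          # standard amino-acid alphabet
N_CHANNELS = 2     # channel 0: one-hot encoding, channel 1: PSSM scores
N_CLASSES = 3      # e.g., helix / strand / coil

def encode_window(window_seq, window_pssm):
    """Stack one-hot and PSSM features into a (channels, WINDOW, 20) tensor.

    window_seq:  list of residue indices (0-19) of length WINDOW
    window_pssm: float tensor of shape (WINDOW, 20) with position-specific scores
    """
    one_hot = torch.zeros(WINDOW, N_AA)
    one_hot[torch.arange(WINDOW), torch.tensor(window_seq)] = 1.0
    return torch.stack([one_hot, window_pssm], dim=0)      # (2, WINDOW, 20)

class WindowCnn2d(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(N_CHANNELS, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # pool over window and residue axes
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):                      # x: (batch, 2, WINDOW, 20)
        h = self.features(x).flatten(1)        # (batch, 64)
        return self.classifier(h)              # class logits for the centre residue
```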
“…However, there are only a few methods for the merged prediction of the topological and secondary structures of TMPs. TMPSS [23] and MASSP [24] achieved simultaneous prediction by using the DL of Convolutional Neural Networks (CNNs) or Long Short-Term Memory (LSTM) layers, and such studies usually employed multi-task learning to predict structures separately, creating conflicting prediction results. Furthermore, DL is not well suited for modeling temporal phenomena.…”
Section: Introduction
Mentioning, confidence: 99%
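The multi-task set-up described in this statement (shared layers with separate outputs for topology and secondary structure) can be sketched roughly as below. This is a minimal illustration assuming a shared CNN + BiLSTM encoder, two per-residue classification heads, and a summed cross-entropy loss; it is not the TMPSS or MASSP implementation.

```python
# Minimal sketch of a multi-task per-residue predictor: a shared encoder with
# separate heads for membrane topology and secondary structure (illustrative).
import torch
import torch.nn as nn

N_FEATS = 20        # assumed per-residue input features
N_TOPO = 3          # e.g., inside / membrane / outside
N_SS = 3            # e.g., helix / strand / coil

class MultiTaskPredictor(nn.Module):
    def __init__(self, n_feats=N_FEATS, hidden=64):
        super().__init__()
        # Shared CNN + BiLSTM encoder produces one representation per residue.
        self.conv = nn.Conv1d(n_feats, hidden, kernel_size=7, padding=3)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        # Task-specific heads share the encoder and are trained jointly.
        self.topology_head = nn.Linear(2 * hidden, N_TOPO)
        self.ss_head = nn.Linear(2 * hidden, N_SS)

    def forward(self, x):                      # x: (batch, length, n_feats)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.lstm(h)                    # (batch, length, 2 * hidden)
        return self.topology_head(h), self.ss_head(h)

def multitask_loss(topo_logits, ss_logits, topo_labels, ss_labels):
    """Sum of per-task cross-entropy losses over all residues."""
    ce = nn.CrossEntropyLoss()
    loss_topo = ce(topo_logits.reshape(-1, N_TOPO), topo_labels.reshape(-1))
    loss_ss = ce(ss_logits.reshape(-1, N_SS), ss_labels.reshape(-1))
    return loss_topo + loss_ss
```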