2021
DOI: 10.3389/fcell.2021.662983
DTL-DephosSite: Deep Transfer Learning Based Approach to Predict Dephosphorylation Sites

Abstract: Phosphorylation, which is mediated by protein kinases and opposed by protein phosphatases, is an important post-translational modification that regulates many cellular processes, including cellular metabolism, cell migration, and cell division. Due to its essential role in cellular physiology, a great deal of attention has been devoted to identifying sites of phosphorylation on cellular proteins and understanding how modification of these sites affects their cellular functions. This has led to the development …

Cited by 12 publications (10 citation statements) | References 34 publications
“…We next demonstrated the application of DeepET in a transfer learning approach 35. In transfer learning, a model pre‐trained on a large (source) data set, such as DeepET, is re‐purposed to another similar (target) problem from the same or a related domain with a smaller number of training samples, by (a) further training and thus fine‐tuning certain layers or (b) resetting their weights and training them from scratch 36,37. This is particularly useful for biological data sets since (a) large numbers of biological samples are expensive to collect and (b) the capacity of classical ML models like random forests is usually limited by the availability of relevant features 26.…”
Section: Results
confidence: 99%
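The two reuse strategies described in the quoted passage, (a) fine-tuning pre-trained layers versus (b) resetting a layer's weights and retraining it from scratch while keeping the learned feature layers frozen, can be sketched with a toy network. The tiny NumPy model, the synthetic data, and the names `W_frozen` and `W_head` below are illustrative assumptions only, not the DeepET or DTL-DephosSite architectures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "pre-trained" weights, standing in for a model fitted
# on a large source data set.
W_frozen = rng.normal(size=(8, 4))       # early feature layer: reused as-is
W_source_head = rng.normal(size=(4, 1))  # source-task output layer

def forward(X, W1, W2):
    hidden = np.maximum(X @ W1, 0.0)  # ReLU feature layer
    return hidden @ W2                # linear output head

# Small target data set (toy regression, 16 samples).
X = rng.normal(size=(16, 8))
y = forward(X, W_frozen, rng.normal(size=(4, 1)))  # synthetic targets

# Strategy (b): reset the head's weights and train it from scratch.
# Strategy (a), fine-tuning, would instead start from W_source_head.
W_head = rng.normal(size=(4, 1))

hidden = np.maximum(X @ W_frozen, 0.0)  # frozen features, computed once
mse_before = float(np.mean((hidden @ W_head - y) ** 2))

# Gradient descent updates only the head; W_frozen never changes,
# which is what "freezing" the pre-trained layers means.
for _ in range(2000):
    grad = hidden.T @ (hidden @ W_head - y) / len(X)  # dMSE/dW_head
    W_head -= 0.05 * grad

mse_after = float(np.mean((hidden @ W_head - y) ** 2))
```

Because only the small head is trained, the few target samples are spent on few parameters, which is why this recipe suits data-poor biological problems.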
“…Along with predicting conventional PTMs associated with functional group addition, deep learning-based methods have also been applied to predict niche-type PTMs; for instance, Chaudhari et al. developed a transfer learning-based predictor (DTL-DephosSite) for dephosphorylation site prediction [127]. To collect datasets of S, T, and Y dephosphorylation sites, they integrated the experimentally verified datasets from the literature and datasets from the DEPOD database.…”
Section: Other PTMs
confidence: 99%
“… 2020 [122]

Tool | PTM type | Species | Model | Validation | Dataset size | URL | Year [Ref]
DeepGlut | Glutarylation | Prokaryotes and Eukaryote | CNN | 10-fold CV | 4,572 | https://github.com/urmisen/DeepGlut * | 2020 [123]
NPalmitoylDeep-PseAAC | N-Palmitoylation | Human | DNN | holdout | 4,364 | https://mega.nz/#F!s9cSiQIa!1jXO0NmgrhxUqOexmYuouA | 2021 [124]
DTL-DephosSite | Dephosphorylation | Human | Bi-LSTM | 5-fold CV and independent test | 4,956 | https://github.com/dukkakc/DTLDephos | 2021 [127]
PreCar_Deep | Carbonylation | Human and other Mammals | CNN + BiLSTM | 10-fold CV and independent test | 5,003 | https://github.com/QUST-SHULI/PreCar_Deep/ | 2021 [125]
He et al.'s work | SUMOylation, Ubiquitylation | | CNN + DNN | 10-fold CV | 280,731 | https://github.com/lijingyimm/MultiUbiSUMO | 2021 [126]

Note: *, link is not working at the time of writing. Multiple, more than three species or PTM types.…”
Section: Other PTMs
confidence: 99%
“…We next demonstrated the application of DeepET in a transfer learning approach 30. In transfer learning, a model pre-trained on a large dataset, such as DeepET, is re-purposed to another similar problem from the same domain with a smaller number of training samples, by (i) further training and thus fine-tuning certain layers or (ii) resetting their weights and training them from scratch 31,32. This is particularly useful for biological datasets since (i) large numbers of biological samples are expensive to collect and (ii) the capacity of classical machine learning models like random forests is usually limited by the availability of relevant features 26.…”
Section: Learning Representations of Enzyme Thermal Adaptation
confidence: 99%