2021
DOI: 10.7717/peerj.10813

6mA-Pred: identifying DNA N6-methyladenine sites based on deep learning

Abstract: With the accumulation of data on 6mA modification sites, an increasing number of scholars have begun to focus on identifying 6mA sites. Despite the recognized importance of these sites, identification methods remain scarce, and most existing methods target a single species. In the present study, we aimed to develop an identification method suitable for multiple species. Building on previous research, we propose a method for 6mA site recognition. Our experiment…

Cited by 13 publications (2 citation statements). References 67 publications.
“…Therefore, the first step is to convert a one-dimensional DNA sequence into a group of short fragments. Although one-hot encoding ( Abbas, Tayara & Chong, 2021 ) can represent each DNA base as binary bits, it cannot capture sequence order or measure the distance ( Huang et al., 2021 ) between related words. In this article, word embedding and the GloVe algorithm are used to better model the relationships in DNA sequences.…”
Section: Methods
confidence: 99%
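For orientation, here is a minimal sketch contrasting the two representations discussed in the statement above. The choice of overlapping 3-mers as the "words" for embedding is an illustrative assumption, not a detail taken from the cited papers. The k-mer list would feed an embedding layer (for example, one initialized with GloVe co-occurrence vectors), whereas the one-hot matrix encodes each position independently with no learned notion of distance between fragments.

```python
import numpy as np

# One-hot encoding: each base becomes a 4-bit binary vector.
# The representation is purely positional; it carries no measure of
# similarity or distance between related fragments.
BASE_TO_ONEHOT = {
    "A": [1, 0, 0, 0],
    "C": [0, 1, 0, 0],
    "G": [0, 0, 1, 0],
    "T": [0, 0, 0, 1],
}

def one_hot(seq: str) -> np.ndarray:
    """Encode a DNA sequence as an (L, 4) binary matrix."""
    return np.array([BASE_TO_ONEHOT[b] for b in seq], dtype=np.float32)

def to_kmers(seq: str, k: int = 3) -> list:
    """Split a sequence into overlapping k-mer 'words' for an embedding
    layer (k = 3 is an illustrative choice, not from the cited work)."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

seq = "ACGTAGCTAG"
print(one_hot(seq).shape)  # (10, 4)
print(to_kmers(seq))       # ['ACG', 'CGT', 'GTA', 'TAG', ...]
```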
“…In addition to CNN and LSTM, the attention mechanism, which is incorporated in recent powerful neural network models such as the Transformer [21] and Bidirectional Encoder Representations from Transformers (BERT) [22], has driven remarkable improvements in neural network development. Huang et al. combined LSTM with attention mechanisms and achieved performance comparable to SICD6mA in 6mA site prediction [23]. Yu et al. constructed a BERT-based neural network model, named iDNA-ABT [24], and compared it with previous models, including iDNA-MS and SNNRice6mA, on the benchmark datasets constructed by Lv et al. (iDNA-MS).…”
Section: Introduction
confidence: 99%
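To make the LSTM-plus-attention idea concrete, here is a minimal sketch of such a classifier. All layer sizes, the additive attention form, and every name below are illustrative assumptions; this is not the architecture of Huang et al. or any other cited model.

```python
import torch
import torch.nn as nn

class LSTMAttention6mA(nn.Module):
    """Hypothetical sketch: a BiLSTM over k-mer token embeddings,
    followed by additive attention pooling and a binary 6mA/non-6mA
    head. Dimensions are illustrative, not taken from cited work."""

    def __init__(self, vocab_size: int = 64, embed_dim: int = 100,
                 hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # per-step attention score
        self.head = nn.Linear(2 * hidden_dim, 1)  # binary logit

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.embed(tokens)                        # (B, T, E)
        h, _ = self.lstm(x)                           # (B, T, 2H)
        weights = torch.softmax(self.attn(h), dim=1)  # (B, T, 1)
        context = (weights * h).sum(dim=1)            # attention-weighted pool
        return self.head(context).squeeze(-1)         # (B,) raw logits

# Toy usage: a batch of 2 sequences, each tokenized into 39 overlapping
# 3-mers (vocab_size 64 = 4**3 possible 3-mers).
model = LSTMAttention6mA()
tokens = torch.randint(0, 64, (2, 39))
logits = model(tokens)
print(torch.sigmoid(logits))  # per-sequence probability of a 6mA site
```

The attention weights give a per-position importance over the sequence window, which is the property that makes such models attractive here: the pooled representation can emphasize the fragments around the candidate adenine rather than treating all positions equally, as a plain final-state LSTM would.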