2021
DOI: 10.1016/j.compbiomed.2021.104259
GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models

Cited by 23 publications (5 citation statements) · References 44 publications
“…We used the BERT model [21, 22] to recognize language entities, entity spans, the semantic types of entities, and the semantic relationships between them. BERT relies on the transformer, an attention-based architecture for learning the contextual relationships between words in a text.…”
Section: Methods
confidence: 99%
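To make the statement above concrete, here is a minimal sketch of BERT-based entity recognition using the Hugging Face transformers pipeline. It is not the cited authors' code: the dslim/bert-base-NER checkpoint is an illustrative stand-in (a general-purpose English NER model, not a biomedical one), and the input sentence is invented.

```python
# Minimal sketch: token-level entity recognition with a pre-trained BERT.
# The checkpoint is a generic English NER model, used here only to
# illustrate the entity-span extraction the citing study describes.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",      # illustrative stand-in checkpoint
    aggregation_strategy="simple",    # merge word pieces into entity spans
)

text = "Researchers at Stanford University in California released the dataset."
for entity in ner(text):
    # Each hit reports the recovered span, its semantic type, and a score.
    print(entity["word"], entity["entity_group"], round(float(entity["score"]), 3))
```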
“…BERT-based models have been used in the biomedical domain (Zhang et al., 2019; Sun et al., 2021). Furthermore, the BERT technique has been applied to proteins' amino-acid sequences to generate crucial feature representations used in different downstream tasks (Ali Shah et al., 2021; Ali Shah and Ou, 2021; Charoenkwan et al., 2021). Thus, we applied a BERT-based embedding model (Vaswani et al., 2017; Devlin et al., 2018) called ProtTrans (Elnaggar et al., 2022) to automatically extract from the amino-acid sequences crucial features that capture the most significant properties of each gene in our dataset.…”
Section: Methods
confidence: 99%
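The ProtTrans feature extraction described above can be sketched as follows. This assumes the publicly available Rostlab/prot_bert checkpoint and the Hugging Face transformers API; the toy sequence and the mean-pooling step are assumptions for illustration, not details taken from the citing study.

```python
# Minimal sketch: extracting a fixed-length feature vector for a protein
# sequence with ProtTrans (ProtBert). Sequence and pooling are illustrative.
import re
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert")
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # toy amino-acid sequence
# ProtBert expects space-separated residues; rare residues U/Z/O/B map to X.
spaced = " ".join(re.sub(r"[UZOB]", "X", sequence))

with torch.no_grad():
    inputs = tokenizer(spaced, return_tensors="pt")
    hidden = model(**inputs).last_hidden_state   # shape: (1, tokens, 1024)

# Mean-pool the per-residue embeddings into one 1024-dim feature vector,
# which can then feed a downstream classifier.
features = hidden.mean(dim=1).squeeze(0)
print(features.shape)                            # torch.Size([1024])
```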
“…Bidirectional Encoder Representations from Transformers (BERT), among other pre-trained language models, significantly outperforms previous benchmarks on eleven NLP tasks, including sentence-level sentiment classification, setting a new standard for text representation [69]. To capture the semantics and context of words, BERT uses the notion of contextualised word embeddings [70]. BERT has proved to be a straightforward yet effective language model that performs at a high level on eleven NLP tasks [71].…”
Section: NLP Methods Based on BERT for Classification
confidence: 99%
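As a concrete illustration of the sentence-level classification task mentioned above, here is a minimal sketch built on bert-base-uncased via Hugging Face transformers. The two-label setup and the example sentence are assumptions; the classification head is freshly initialised and would need fine-tuning on labelled data before its outputs are meaningful.

```python
# Minimal sketch: sentence-level classification with a pre-trained BERT.
# The two labels (e.g. positive/negative sentiment) are an assumption.
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
model.eval()

inputs = tokenizer("The method performs remarkably well.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# NOTE: the classification head is randomly initialised here; fine-tuning
# on labelled sentences is required before these probabilities are useful.
print(logits.softmax(dim=-1))
```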