2021
DOI: 10.1016/j.ab.2021.114416

Identification of efflux proteins based on contextual representations with deep bidirectional transformer encoders

Cited by 2 publications (3 citation statements)
References 26 publications
“…These models were pre-trained exclusively on a plain-text corpus. Learning representations of language sentences can provide effective feature representations for protein sequences [27]. This study employed the BERT-base-uncased pre-trained model, which consists of 12 layers, 768 hidden units, and 12 attention heads, and comprises 110 million parameters.…”
Section: Methods
confidence: 99%
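
The statement above specifies the BERT-base-uncased configuration. As a quick illustration (not code from the cited paper), those same figures can be confirmed with the Hugging Face transformers library; the "bert-base-uncased" string is the standard Hugging Face model identifier, not something taken from the excerpt.

from transformers import BertConfig, BertModel

# Inspect the published configuration of BERT-base-uncased.
config = BertConfig.from_pretrained("bert-base-uncased")
print(config.num_hidden_layers)    # 12 transformer layers
print(config.hidden_size)          # 768 hidden units
print(config.num_attention_heads)  # 12 attention heads

# Count the parameters (roughly 110 million for this checkpoint).
model = BertModel.from_pretrained("bert-base-uncased")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")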
“…In another study, Zhang et al. leveraged the pre-training strategy in the field of antibacterial peptide prediction, developing a novel method for antibacterial peptide recognition based on BERT [26]. Taju et al. used the contextualized word embeddings from BERT and a support vector machine classifier to identify efflux proteins [27]. Moreover, Charoenkwan et al. introduced a BERT-based model, BERT4Bitter, that enhances the prediction of bitter peptides based solely on their amino acid sequence [23].…”
Section: Introduction
confidence: 99%
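
The pipeline attributed to Taju et al. above (BERT contextual embeddings fed to a support vector machine) can be sketched as follows. This is a minimal sketch under assumptions, not the authors' published code: the mean-pooling step, the RBF kernel, and the toy sequences and labels are placeholder choices made for illustration.

import torch
from transformers import BertTokenizer, BertModel
from sklearn.svm import SVC

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert.eval()

def embed(sequence: str) -> list[float]:
    # Treat each amino acid as a "word" so the plain-text tokenizer
    # can process the protein sequence, then mean-pool the last
    # hidden layer into a single fixed-length feature vector.
    tokens = tokenizer(" ".join(sequence), return_tensors="pt",
                       truncation=True, max_length=512)
    with torch.no_grad():
        hidden = bert(**tokens).last_hidden_state  # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0).tolist()

# Hypothetical training data: 1 = efflux protein, 0 = non-efflux.
sequences = ["MKVLITG", "MSTNAKQ"]
labels = [1, 0]

clf = SVC(kernel="rbf")
clf.fit([embed(s) for s in sequences], labels)
print(clf.predict([embed("MKALVTG")]))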
“…Membrane proteins [50], efflux proteins [51], ion transporters [52], transient receptor potential channels [24], glucose transporters [53], α-helical transmembrane proteins [54]…”
Section: Features Related Work
confidence: 99%