2021
DOI: 10.1101/2021.11.17.468929
Preprint
Predicting SARS-CoV-2 epitope-specific TCR recognition using pre-trained protein embeddings

Abstract: The COVID-19 pandemic is ongoing because of the high transmission rate of the virus and the emergence of SARS-CoV-2 variants. The P272L mutation in the SARS-CoV-2 S-protein is known to be highly relevant to the viral escape associated with the second pandemic wave in Europe. Epitope-specific T-cell receptor (TCR) recognition is a key factor in determining the T-cell immunogenicity of a SARS-CoV-2 epitope. Although several data-driven methods for predicting epitope-specific TCR recognition have been proposed, they remain challe…

Cited by 1 publication (2 citation statements) | References 44 publications
“…Recently, the advent of Bidirectional Encoder Representations from Transformers (BERT) models revolutionized the natural language processing (NLP) field 18. Because of the similarities between protein sequences and natural-language sentences, several studies have applied BERT models to TCR-related tasks 19,20. TCR-BERT 19 utilized over 8,000 TCR sequences for self-supervised learning and showed promising performance on downstream tasks such as TCR-antigen binding-affinity prediction and sequence clustering.…”
Section: Introduction
confidence: 99%
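To make the self-supervised step concrete, below is a minimal sketch of the masked-token objective that models such as TCR-BERT pre-train with on unlabeled TCR sequences. It is an illustration only: the tiny Transformer encoder, the vocabulary layout, and the example CDR3β sequence are assumptions for demonstration, not the TCR-BERT implementation.

# Hedged sketch of masked-token self-supervised pretraining on TCR
# sequences, in the style of TCR-BERT. All names here are illustrative.
import random
import torch
import torch.nn as nn

AMINO_ACIDS = 'ACDEFGHIKLMNPQRSTVWY'
PAD, MASK = 0, 1                              # assumed special-token layout
VOCAB = {aa: i + 2 for i, aa in enumerate(AMINO_ACIDS)}
VOCAB_SIZE = len(VOCAB) + 2

def mask_tokens(ids: torch.Tensor, mask_prob: float = 0.15):
    """Replace ~15% of residues with [MASK]; targets are -100 (ignored
    by the loss) everywhere except the masked positions."""
    targets = torch.full_like(ids, -100)
    for i in range(ids.size(0)):
        for j in range(ids.size(1)):
            if ids[i, j] != PAD and random.random() < mask_prob:
                targets[i, j] = ids[i, j]
                ids[i, j] = MASK
    if (targets == -100).all():               # ensure at least one mask
        targets[0, 0] = ids[0, 0]
        ids[0, 0] = MASK
    return ids, targets

class TinyTCRBert(nn.Module):
    """A deliberately small Transformer encoder standing in for BERT."""
    def __init__(self, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model, padding_idx=PAD)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.lm_head = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, ids):
        return self.lm_head(self.encoder(self.embed(ids)))

# One pretraining step on a toy CDR3beta sequence.
seq = 'CASSIRSSYEQYF'
ids = torch.tensor([[VOCAB[aa] for aa in seq]])
masked, targets = mask_tokens(ids.clone())
model = TinyTCRBert()
logits = model(masked)                        # (batch, length, vocab)
loss = nn.CrossEntropyLoss(ignore_index=-100)(
    logits.view(-1, VOCAB_SIZE), targets.view(-1))
loss.backward()                               # self-supervised update

After pretraining on a large unlabeled TCR corpus, the encoder's representations can be reused for downstream tasks such as the binding-affinity prediction and clustering mentioned above.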
“…TCR-BERT 19 utilized over 8,000 TCR sequences for self-supervised learning and showed promising performance on downstream tasks such as TCR-antigen binding-affinity prediction and sequence clustering. Han et al. 20 presented a BERT-based model that fine-tunes the pre-trained Tasks Assessing Protein Embeddings (TAPE) 21 model to predict SARS-CoV-2 T-cell epitope-specific TCR recognition. However, to the best of our knowledge, all existing approaches have focused only on modeling natural TCR sequences.…”
Section: Introduction
confidence: 99%
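As a rough illustration of the fine-tuning setup Han et al. describe, the sketch below wires a pre-trained TAPE encoder (via the tape-proteins package, whose ProteinBertModel and TAPETokenizer usage follows its documentation) into a binary TCR-epitope recognition classifier. The classifier head, the joint-embedding design, and the example sequences are hypothetical assumptions, not the authors' actual architecture; only the TAPE calls follow the package's documented API.

# Hedged sketch: fine-tuning pre-trained TAPE embeddings for
# TCR-epitope recognition. Architecture is illustrative only.
import torch
import torch.nn as nn
from tape import ProteinBertModel, TAPETokenizer  # pip install tape_proteins

class TCREpitopeClassifier(nn.Module):
    """Binary classifier over pooled TAPE embeddings of a TCR (CDR3beta)
    sequence and an epitope peptide (hypothetical head design)."""
    def __init__(self, hidden_dim: int = 768):   # 768 matches 'bert-base'
        super().__init__()
        self.encoder = ProteinBertModel.from_pretrained('bert-base')
        self.head = nn.Sequential(
            nn.Linear(2 * hidden_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 1),                   # logit: TCR recognizes epitope
        )

    def embed(self, token_ids: torch.Tensor) -> torch.Tensor:
        # TAPE returns (sequence_output, pooled_output); use the pooled vector.
        _, pooled = self.encoder(token_ids)
        return pooled

    def forward(self, tcr_ids, epitope_ids):
        joint = torch.cat([self.embed(tcr_ids), self.embed(epitope_ids)], dim=-1)
        return self.head(joint).squeeze(-1)

tokenizer = TAPETokenizer(vocab='iupac')
tcr = torch.tensor([tokenizer.encode('CASSLAPGATNEKLFF')])  # example CDR3beta
epitope = torch.tensor([tokenizer.encode('YLQPRTFLL')])     # SARS-CoV-2 S epitope
model = TCREpitopeClassifier()
prob = torch.sigmoid(model(tcr, epitope))                   # recognition probability

In fine-tuning, both the classification head and (optionally) the pre-trained encoder weights are updated on labeled TCR-epitope pairs, which is what lets the model transfer general protein-sequence knowledge to the epitope-specific recognition task.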