2024
DOI: 10.1021/acs.jcim.4c00689

Simple, Efficient, and Scalable Structure-Aware Adapter Boosts Protein Language Models

Yang Tan, Mingchen Li, Bingxin Zhou et al.

Abstract: Fine-tuning pretrained protein language models (PLMs) has emerged as a prominent strategy for enhancing downstream prediction tasks, often outperforming traditional supervised learning approaches. Parameter-efficient fine-tuning (PEFT), a powerful and widely applied technique in natural language processing, could likewise enhance the performance of PLMs. However, transferring it directly to life science tasks is nontrivial because of differences in training strategies and data forms. To address this gap, …
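The abstract refers to parameter-efficient fine-tuning (PEFT), in which a small trainable module is inserted into an otherwise frozen pretrained model. As an illustration only, below is a minimal PyTorch sketch of a generic bottleneck adapter, the most common PEFT building block. This is not the paper's structure-aware adapter; the class name, bottleneck width, and freezing helper are illustrative assumptions.

# Minimal sketch of a generic bottleneck adapter for parameter-efficient
# fine-tuning. NOT the paper's structure-aware adapter; it only shows the
# general pattern of training a small inserted module while the pretrained
# PLM stays frozen. All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Down-project, nonlinearity, up-project, plus a residual connection."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Zero-init the up-projection so the adapter starts as an identity
        # mapping and does not perturb the pretrained representations.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))


def trainable_adapter_params(model: nn.Module):
    """Freeze all backbone weights; only parameters whose names contain
    'adapter' (an assumed naming convention) remain trainable."""
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name
    return [p for p in model.parameters() if p.requires_grad]

In practice, such an adapter would be attached after each transformer layer of the PLM, and only the returned parameter list would be passed to the optimizer, so the number of updated weights is a small fraction of the full model.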

Cited by 3 publications
References 60 publications