2024 · Preprint
DOI: 10.21203/rs.3.rs-3487308/v2

Self-Attention Factor-Tuning for Parameter Efficient Fine-Tuning

Jason Abohwo

Abstract: Transformers have revolutionized the fields of Natural Language Processing and Computer Vision, a result of their ability to capture long-range dependencies through their key innovation, the attention mechanism. Despite the success of these models, their growing complexity has led to an ever-increasing need for processing power, making practical applications less feasible. In recent years, tensor decomposition-based parameter-efficient fine-tuning techniques have emerged as a promising solution to the comp…
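The abstract describes tensor decomposition-based parameter-efficient fine-tuning of attention layers, but the full text is truncated above, so the sketch below is only a generic illustration of that family of methods, not the paper's actual Self-Attention Factor-Tuning algorithm. It freezes a pretrained attention projection and trains a small additive rank-r factorized update, in the style of low-rank adapters; all names here (FactorTunedLinear, q_proj, rank) are hypothetical.

    import torch
    import torch.nn as nn

    class FactorTunedLinear(nn.Module):
        """Frozen pretrained linear layer plus a trainable rank-r factorized update."""

        def __init__(self, base: nn.Linear, rank: int = 8):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False  # freeze the pretrained weights
            out_features, in_features = base.weight.shape
            # Factorized update delta_W = B @ A: only rank * (in + out)
            # parameters are trained instead of in * out.
            self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(out_features, rank))  # zero init: no change at start

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Frozen path plus the factorized correction (x @ A^T) @ B^T.
            return self.base(x) + (x @ self.A.T) @ self.B.T

    # Hypothetical usage: wrap the query projection of a self-attention block.
    q_proj = nn.Linear(768, 768)        # stands in for a pretrained projection
    tuned_q = FactorTunedLinear(q_proj, rank=8)
    x = torch.randn(2, 16, 768)         # (batch, sequence length, embedding dim)
    print(tuned_q(x).shape)             # torch.Size([2, 16, 768])

Zero-initializing one factor means the wrapped layer reproduces the pretrained model exactly at the start of fine-tuning, so training only gradually departs from the frozen weights; this is a common design choice in factorized PEFT methods.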

Cited by 0 publications
References: 11 publications (22 reference statements)