2023 10th International Conference on Advanced Informatics: Concept, Theory and Application (ICAICTA)
DOI: 10.1109/icaicta59291.2023.10390269

Simple Hack for Transformers Against Heavy Long-Text Classification on a Time- and Memory-Limited GPU Service

Mirza Alim Mutasodirin,
Radityo Eko Prasojo,
Achmad F. Abka
et al.

Abstract: Many NLP researchers rely on free computational services, such as Google Colab, to fine-tune their Transformer models. This limits hyperparameter optimization (HPO) for long-text classification, because the method's quadratic complexity demands larger memory and compute resources. In Indonesian, only a few works on long-text classification with Transformers were found; most use only a small amount of data and report no HPO. In this study, using 18k news articles, we investigate which pretrained model…
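The abstract is cut off above, so the paper's "simple hack" itself is not visible here. As context for the constraint it describes, below is a minimal sketch of long-text classification with a pretrained Transformer under a tight GPU budget, where inputs are capped at the model's maximum sequence length to keep the quadratic self-attention cost bounded. The checkpoint name, label count, and truncation setup are illustrative assumptions, not details taken from the paper.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Indonesian BERT checkpoint and label count (illustrative only).
model_name = "indobenchmark/indobert-base-p1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=5
)

# Placeholder standing in for a long Indonesian news article.
texts = ["Contoh artikel berita yang sangat panjang ..."]

# Self-attention cost grows quadratically with sequence length, so
# truncating to the model's 512-token limit keeps memory and time
# within a free-tier GPU budget (e.g. Google Colab).
batch = tokenizer(
    texts, truncation=True, max_length=512, padding=True, return_tensors="pt"
)

model.eval()
with torch.no_grad():
    logits = model(**batch).logits
predicted_class = logits.argmax(dim=-1)

Truncation is only one common mitigation; whether the paper adopts it, or a different trick, cannot be determined from the visible portion of the abstract.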

Cited by: 0 publications
References: 26 publications