Proceedings of the 2022 SIAM Conference on Parallel Processing for Scientific Computing (PP22)
DOI: 10.1137/1.9781611977141.1

GPTuneBand: Multi-task and Multi-fidelity Autotuning for Large-scale High Performance Computing Applications

Abstract: This work proposes a novel multi-task and multi-fidelity autotuning framework, GPTuneBand, for tuning large-scale expensive high performance computing (HPC) applications. GPTuneBand combines a multi-task Bayesian optimization algorithm with a multi-armed bandit strategy, well-suited for tuning expensive HPC applications such as numerical libraries, scientific simulation codes and machine learning (ML) models, particularly with a very limited tuning budget. Our numerical results show that compared to other stat…
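For readers unfamiliar with multi-fidelity bandit tuning, the sketch below illustrates the general successive-halving idea behind Hyperband-style budget allocation: many candidate configurations are evaluated cheaply at low fidelity, and only the best survivors are promoted to higher fidelities. This is a minimal illustration, not the GPTuneBand algorithm itself (which couples the bandit strategy with a multi-task Bayesian optimization surrogate rather than random sampling); run_app, the two tuning parameters, and the toy objective are hypothetical placeholders for a real HPC application.

```python
import random

def run_app(config, fidelity):
    # Hypothetical stand-in for an expensive HPC run; returns a runtime-like
    # score (lower is better). A real tuner would launch the application at
    # the given problem size / fidelity and measure its runtime.
    x, y = config
    noise = random.gauss(0.0, 0.05) / fidelity
    return (x - 0.3) ** 2 + (y - 0.7) ** 2 + noise

def successive_halving(n_configs=27, min_fidelity=1, eta=3, rounds=3):
    # Bandit-style budget allocation: score all candidates at the current
    # fidelity, keep the best 1/eta fraction, then re-evaluate the survivors
    # at an eta-times larger fidelity.
    configs = [(random.random(), random.random()) for _ in range(n_configs)]
    fidelity = min_fidelity
    best = None
    for _ in range(rounds):
        scored = sorted((run_app(c, fidelity), c) for c in configs)
        best = scored[0]
        keep = max(1, len(configs) // eta)
        configs = [c for _, c in scored[:keep]]
        fidelity *= eta
    return best

if __name__ == "__main__":
    score, config = successive_halving()
    print(f"best configuration {config} with score {score:.4f}")
```

In a framework like GPTuneBand, the random candidate draw above would be replaced by samples proposed from a surrogate model shared across related tuning tasks, so that information from cheap, low-fidelity runs and from previously tuned problem sizes guides where the limited high-fidelity budget is spent.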
