2010 IEEE International Symposium on Parallel & Distributed Processing (IPDPS)
DOI: 10.1109/ipdps.2010.5470479

Offline library adaptation using automatically generated heuristics

Abstract: Automatic tuning has emerged as a solution to provide high-performance libraries for fast-changing, increasingly complex computer architectures. We distinguish offline adaptation (e.g., in ATLAS), which is performed during installation without the full problem description, from online adaptation (e.g., in FFTW), which is performed at runtime. Offline adaptive libraries are simpler to use, but, unfortunately, writing the adaptation heuristics that power them is a daunting task. The overhead of online adapti…
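As a rough illustration of the offline/online distinction described in the abstract, the sketch below (hypothetical Python, not taken from the paper) shows an install-time step that benchmarks candidate kernel variants and freezes a simple size-threshold heuristic; at call time the library only consults that frozen heuristic, with no further search:

```python
import time

# Hypothetical candidate implementations of the same kernel.
def variant_blocked(data):
    return sorted(data)   # stand-in for a cache-blocked variant

def variant_simple(data):
    return sorted(data)   # stand-in for a straightforward variant

VARIANTS = {"blocked": variant_blocked, "simple": variant_simple}

def offline_install_time_tuning(sizes=(1_000, 10_000, 100_000)):
    """Run once at installation: time each variant on sample sizes and
    derive a size threshold above which 'blocked' wins (the heuristic)."""
    threshold = None
    for n in sizes:
        sample = list(range(n, 0, -1))
        timings = {}
        for name, fn in VARIANTS.items():
            start = time.perf_counter()
            fn(list(sample))
            timings[name] = time.perf_counter() - start
        if threshold is None and timings["blocked"] < timings["simple"]:
            threshold = n   # smallest size where 'blocked' was faster
    return threshold if threshold is not None else float("inf")

# Frozen at install time; runtime performs no search (offline adaptation).
SIZE_THRESHOLD = offline_install_time_tuning()

def kernel(data):
    """Runtime dispatch uses only the precomputed heuristic."""
    variant = "blocked" if len(data) >= SIZE_THRESHOLD else "simple"
    return VARIANTS[variant](data)
```

An online-adaptive library such as FFTW would instead run its search (planning) at runtime, once the full problem description is known.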


Cited by 17 publications (11 citation statements) | References 16 publications
“…Examples of such systems include ATLAS [37], PhiPAC [38], and OSKI [11] for linear algebra, [39], FFTW [40] and SPIRAL [41] for signal processing, [42], [43], [44] for stencil computations, and [45] for sorting.…”
Section: Related Work
confidence: 99%
“…Compiler researchers have proposed to use machine learning models to focus search on beneficial areas of the optimization search space [6,10,11,14,20,25]. An important step in using these models is to characterize (or construct features for) the programs being optimized.…”
Section: Characterizing the Program
confidence: 99%
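To make the "characterizing" step in the excerpt above concrete, here is a minimal, hypothetical sketch (not from either paper; the feature set is an assumption) of turning a loop nest description into the fixed-length feature vector such models consume:

```python
# Hypothetical static source features for a loop nest; the exact feature
# set is an assumption, not the one used in the cited work.
def extract_features(loop):
    return [
        loop.get("depth", 0),             # loop nest depth
        loop.get("trip_count", 0),        # estimated iterations
        loop.get("num_loads", 0),         # memory reads in the body
        loop.get("num_stores", 0),        # memory writes in the body
        loop.get("num_flops", 0),         # floating-point operations
        int(loop.get("has_branches", False)),
    ]

# Example: a dense matrix-multiply-like loop nest.
features = extract_features({
    "depth": 3, "trip_count": 512, "num_loads": 2,
    "num_stores": 1, "num_flops": 2, "has_branches": False,
})
print(features)   # -> [3, 512, 2, 1, 2, 0]
```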
“…However, many of the proposed techniques for exploring optimizations (e.g., genetic algorithms [5], random search [21], statistical techniques [18], or exhaustive search [20]) are expensive, which limits their practical use. This has led compiler researchers to propose using "intelligent" prediction models that focus exploration to beneficial areas of the optimization search space [4,7,11,13,22]. Prediction models can reduce the cost of finding good optimizations, but increase complexity in the design of the search function because models require characterizing the program being optimized (e.g., with source code features or performance counters), generation of training data, and a training phase.…”
Section: Learning To Optimize
confidence: 99%
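As a toy illustration of the prediction models discussed in this excerpt (hypothetical; it uses scikit-learn, which the cited work is not tied to), the sketch below trains a classifier on feature vectors labeled with the best optimization found by an earlier empirical search, then asks it to pick an optimization for a new program without searching:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data gathered offline: program feature vectors
# (e.g., loop depth, trip count, arithmetic intensity) paired with the
# optimization that performed best during an expensive search.
X_train = [
    [3, 512, 8.0],
    [2,  64, 0.5],
    [3, 256, 4.0],
    [1,  16, 0.1],
]
y_train = ["tile+vectorize", "unroll", "tile+vectorize", "none"]

model = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

# At tuning time, the model predicts a promising optimization directly,
# replacing (or seeding) the search over the optimization space.
new_program = [[3, 300, 5.0]]
print(model.predict(new_program))   # e.g. ['tile+vectorize']
```

The training data, features, and decision-tree choice here are all illustrative assumptions; the point is only the division of cost between an offline training phase and a cheap per-program prediction.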