2014
DOI: 10.1145/2536688
Automatic feature generation for machine learning-based optimising compilation

Abstract: Recent work has shown that machine learning can automate and in some cases outperform handcrafted compiler optimisations. Central to such an approach is that machine learning techniques typically rely upon summaries or features of the program. The quality of these features is critical to the accuracy of the resulting machine learned algorithm; no machine learning method will work well with poorly chosen features. However, due to the size and complexity of programs, theoretically there are an infinite number of…

Cited by 23 publications (12 citation statements) · References 21 publications
“…It searches for different combinations of code transformations and transformation parameters, evaluates their effect on performance, and at the end keeps the best performing binary. Multiple approaches exist to select optimization sequences: random, through genetic search, statistical models [41], or machine learned models [33]. Despite the significant performance benefits of iterative compilation, it has not been applied in a general way on mobile systems.…”
Section: Motivation (confidence: 99%)
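The search loop described in the statement above can be sketched in a few lines. The following is a minimal, self-contained illustration of random iterative compilation: the pass names are invented for the example, and the `evaluate` function is a synthetic stand-in for compiling the program with a given pass sequence and timing the resulting binary.

```python
import random

# Hypothetical transformation passes; a real compiler exposes many more.
PASSES = ["unroll", "tile", "vectorize", "inline", "fuse"]

def evaluate(sequence):
    """Stand-in for compiling and timing the binary.

    In a real iterative-compilation loop this would invoke the compiler
    with `sequence` and measure the run time of the produced binary;
    here a deterministic synthetic cost keeps the sketch runnable.
    """
    return sum((i + 1) * (len(p) % 3 + 1) for i, p in enumerate(sequence))

def random_search(trials=100, seq_len=3, seed=0):
    """Randomly sample pass sequences and keep the best-performing one."""
    rng = random.Random(seed)
    best_seq, best_cost = None, float("inf")
    for _ in range(trials):
        seq = [rng.choice(PASSES) for _ in range(seq_len)]
        cost = evaluate(seq)
        if cost < best_cost:
            best_seq, best_cost = seq, cost
    return best_seq, best_cost
```

The genetic-search and model-driven variants mentioned in the quote differ only in how the next candidate sequence is chosen; the evaluate-and-keep-best skeleton stays the same.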
“…and each feature extraction requires 3 seconds for a human being, then such a program would consume 3000 seconds, or 50 minutes, for complete feature extraction so that those features could then be used in a machine learning algorithm. Since feature extraction from program source code using NLP is heavily utilized in compiler optimization, one could argue that the field of compiler optimization has improved considerably in the past couple of years [24], [25]. However, given the emergence of specialized frameworks and directives such as OpenMP, OpenCL, and OpenVX, modern automated code feature extraction methods [1]–[4] still lag in terms of accurately extracting such features in a completely automated manner.…”
Section: Motivational Case Study: Traditional Feature Extraction (confidence: 99%)
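The arithmetic in the quoted example can be checked directly. The per-feature time and the total are stated in the quote; the feature count is only implied by those figures (the start of the quote is truncated), so it is derived here rather than taken from the source.

```python
seconds_per_feature = 3    # stated in the quote
total_seconds = 3000       # stated in the quote

# The feature count implied by the quoted totals: 3000 s at 3 s each.
features = total_seconds // seconds_per_feature

# 3000 seconds expressed in minutes, matching the quoted "50 minutes".
minutes = total_seconds / 60
```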
“…In [46], loop tiling is applied with fixed tile sizes. In [29], [28], and [50], only the loop unroll transformation is applied. As a consequence, a very large number of solutions remains untested.…”
Section: Related Work (confidence: 99%)
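Of the transformations named above, loop tiling with a fixed tile size is easy to illustrate. The sketch below is not from the cited works: the array size and tile size are arbitrary assumptions, and Python is used only to show the blocked traversal order. Tiling does not change the result of the computation; in a compiled language the blocked order improves cache reuse.

```python
def sum_tiled(a, n, tile):
    """Sum an n x n matrix (list of lists), visiting it tile by tile.

    The two outer loops walk over tile origins; the two inner loops
    cover the cells inside each (at most tile x tile) block, clamping
    at the matrix edge when n is not a multiple of the tile size.
    """
    total = 0
    for ii in range(0, n, tile):
        for jj in range(0, n, tile):
            for i in range(ii, min(ii + tile, n)):
                for j in range(jj, min(jj + tile, n)):
                    total += a[i][j]
    return total
```

With a fixed tile size, as in the cited work, only one point of the tiling parameter space is ever evaluated, which is exactly the limitation the quote raises.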
“…The most promising automatic approach is iterative compilation, where many different versions of the program are generated and executed by applying a set of compiler transformations in all different combinations/sequences. However, iterative compilation is extremely expensive in terms of compilation time, and therefore researchers and current compilers try to reduce it by using i) iterative compilation combined with machine learning techniques [29] [5] [54] [38], ii) iterative compilation combined with genetic algorithms [26], iii) heuristics and empirical methods [12], iv) iterative compilation combined with statistical techniques [17], or v) exhaustive search [25]. However, even with these approaches, the remaining exploration space of code optimizations, i.e., the set of all optimization configurations that have to be explored (optimization sets), is still so large that searching is impractical.…”
Section: Introduction (confidence: 99%)
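The impractical size of the exploration space that the statement above refers to follows directly from the combinatorics: with t available transformations applied in ordered sequences of length k, there are t**k distinct configurations. The pass and sequence counts below are illustrative assumptions, not figures from the cited works.

```python
def space_size(num_transforms, seq_len):
    """Number of ordered optimization sequences of a fixed length,
    allowing repeated transformations: num_transforms ** seq_len."""
    return num_transforms ** seq_len

# Even modest counts explode: 50 passes in sequences of length 10
# already give 50**10, roughly 9.8e16 candidate configurations,
# far beyond what compile-and-run evaluation can cover.
```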