Proceedings of the 26th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming 2021
DOI: 10.1145/3437801.3446108
ApproxTuner

Cited by 16 publications (4 citation statements)
References 44 publications
“…ApproxTuner [15] delivers heuristic-based search of the space of possible approximations of each individual network layer, so that a comprehensive speedup-inference accuracy trade-off curve is charted and the list of the most promising sets of approximations is identified. Yet, ApproxTuner does not take into account the peculiarities of the mobile platform and the predicted trade-off curves it draws do not reflect the actual performance observed on the mobiles.…”
Section: Mobiprox Builds Upon the Existing Work on Heterogeneous and ...
confidence: 99%
“…However, ApproxHPVM targets server environments, generates only CUDA-ready binaries, and does not support compilation for mobile hardware (Android or iOS). With the help of ApproxTuner [15], approximation levels within ApproxHPVM can be dynamically adapted; yet, the provided adaptation method is simple, reactive, and context-oblivious.…”
Section: Introduction
confidence: 99%
“…ApproxTuner [40] adds support for heuristic-based search of the space of possible approximations of each individual network layer, so that a comprehensive speedup-inference accuracy trade-off curve is charted and the list of the most promising sets of approximations is identified.…”
Section: Preliminaries
confidence: 99%
“…for its deep learning models. To achieve this, for a given pre-trained neural network, Mobiprox, relying on ApproxTuner [40], first investigates the effect of different per NN-operation approximations on the inference accuracy and identifies speedup-inference accuracy trade-off points. Each of these points is realized through a different set of approximations (and is termed: configuration) applied to layers of a neural network.…”
Section: Mobiprox Framework 4.1 Overview
confidence: 99%
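The citation statements above describe ApproxTuner charting a speedup–inference-accuracy trade-off curve and keeping only the most promising sets of per-layer approximations ("configurations"). A minimal Python sketch of that selection step is shown below; it is not ApproxTuner's actual interface, and the function name `pareto_frontier`, the approximation labels (`fp16`, `perf2`), and the candidate numbers are all illustrative assumptions.

```python
# Hypothetical sketch: each candidate configuration assigns one
# approximation per network layer and has a predicted speedup and
# inference accuracy. The trade-off curve keeps only configurations
# not dominated in both metrics (the Pareto frontier).

def pareto_frontier(configs):
    """Return configurations that no other configuration beats in both
    speedup and accuracy (with at least one strict improvement)."""
    frontier = []
    for c in configs:
        dominated = any(
            o["speedup"] >= c["speedup"] and o["accuracy"] >= c["accuracy"]
            and (o["speedup"] > c["speedup"] or o["accuracy"] > c["accuracy"])
            for o in configs
        )
        if not dominated:
            frontier.append(c)
    return sorted(frontier, key=lambda c: c["speedup"])

# Illustrative candidates for a two-layer network; the labels stand in
# for knobs like half-precision ("fp16") or perforated ops ("perf2").
candidates = [
    {"layers": ("fp16", "fp16"), "speedup": 1.5, "accuracy": 0.92},
    {"layers": ("fp16", "perf2"), "speedup": 2.1, "accuracy": 0.90},
    {"layers": ("perf2", "perf2"), "speedup": 2.4, "accuracy": 0.84},
    {"layers": ("perf2", "fp16"), "speedup": 1.9, "accuracy": 0.86},  # dominated
]
curve = pareto_frontier(candidates)
```

In this sketch the fourth candidate is dropped because another configuration is both faster and more accurate; the surviving points form the trade-off curve from which a runtime could pick a configuration meeting its current accuracy budget.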