2017 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW)
DOI: 10.1109/ipdpsw.2017.28

Online-Autotuning in the Presence of Algorithmic Choice

Cited by 11 publications (9 citation statements, published 2018–2022) · References 16 publications
“…There are several works that rely on the Nelder-Mead algorithm for optimisation [19][20][21][22]. Koenigstein et al. [19] adopt the Nelder-Mead direct search to optimise more than twenty hyperparameters.…”
Section: Related Work (mentioning)
confidence: 99%
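
As background on the Nelder-Mead direct search these works build on, here is a minimal sketch using SciPy's off-the-shelf implementation; the two-parameter objective, its optimum, and the starting point are illustrative placeholders, not taken from any of the cited papers.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder objective: in the cited works this would be a model's
# validation error as a function of its hyperparameters.
def validation_error(hparams):
    learning_rate, regularisation = hparams
    return (learning_rate - 0.1) ** 2 + (regularisation - 0.01) ** 2

# Nelder-Mead is derivative-free: it only evaluates the objective,
# repeatedly reflecting, expanding, and contracting a simplex of
# candidate points until the simplex shrinks below the tolerances.
result = minimize(
    validation_error,
    x0=np.array([0.5, 0.5]),  # arbitrary starting guess
    method="Nelder-Mead",
    options={"xatol": 1e-4, "fatol": 1e-4},
)
print(result.x)  # approximately [0.1, 0.01] on this placeholder objective
```

Because the method never needs gradients, it fits hyperparameter tuning settings where the objective is a black-box measurement, which is why the works cited above reach for it.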
“…The authors adopt Nelder-Mead to identify the optimal hyperparameter initialisation. Pfaffe et al. [22] present an online auto-tuning algorithm for string-matching algorithms. It uses the ε-greedy policy, a well-known reinforcement learning technique, to select the algorithm to be used in each iteration, and adopts Nelder-Mead to tune the parameters until a pre-defined user criterion is met, e.g., a number of tuning iterations.…”
Section: Related Work (mentioning)
confidence: 99%
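
A minimal sketch of the two-level scheme this excerpt describes, under heavy assumptions: the algorithm names and the runtime model are invented, and a single random probe stands in for a Nelder-Mead step on each algorithm's parameter, so this illustrates the control flow rather than Pfaffe et al.'s actual implementation.

```python
import random

def simulated_runtime(algorithm, param):
    """Invented cost model standing in for a real string-matching run."""
    base = {"algo_a": 1.0, "algo_b": 0.8, "algo_c": 1.2}[algorithm]
    return base + 0.01 * (param - 5.0) ** 2 + random.gauss(0.0, 0.02)

def online_tune(iterations=200, epsilon=0.1):
    algorithms = ["algo_a", "algo_b", "algo_c"]
    state = {a: {"total": 0.0, "runs": 0, "param": 1.0} for a in algorithms}
    for _ in range(iterations):
        # Epsilon-greedy selection: with probability epsilon explore a
        # random algorithm, otherwise exploit the best average runtime.
        untried = [a for a in algorithms if state[a]["runs"] == 0]
        if untried:
            algo = random.choice(untried)
        elif random.random() < epsilon:
            algo = random.choice(algorithms)
        else:
            algo = min(algorithms,
                       key=lambda a: state[a]["total"] / state[a]["runs"])
        s = state[algo]
        # One tuning probe on the chosen algorithm's parameter (a crude
        # stand-in for a Nelder-Mead step in the paper's scheme).
        candidate = s["param"] + random.choice([-0.5, 0.5])
        if simulated_runtime(algo, candidate) < simulated_runtime(algo, s["param"]):
            s["param"] = candidate
        runtime = simulated_runtime(algo, s["param"])
        s["total"] += runtime
        s["runs"] += 1
    return state

if __name__ == "__main__":
    for algo, s in online_tune().items():
        print(algo, round(s["total"] / s["runs"], 3), round(s["param"], 2))
```

The pre-defined user criterion from the excerpt maps to the fixed iteration budget here; a real deployment would tune and measure inside the application's own call sites.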
“…In machine learning, the ability to select appropriate features, workflows, machine learning paradigms, algorithms, and their hyper-parameters requires expert knowledge [13]. The few contributions found in the literature addressing this progressive automation of machine learning, or auto-ML, include tools [1,27,9], model selection algorithms [7,6], hyper-parameter optimisation algorithms [18,10,24], and Nelder-Mead optimisation solutions [16,8,25].…”
Section: Related Work (mentioning)
confidence: 99%
“…In terms of automatic hyperparameter selection, there are several different techniques: (i) particle swarm optimisation [7], which is flexible and can be applied to ensemble models [6]; (ii) grid search [18], which minimises the estimated error until it converges on a local minimum; (iii) gradient-based search, e.g., Stochastic Gradient Descent (SGD), which converges to an optimal solution [24]; and (iv) Nelder-Mead direct search, which relies on heuristics to optimise model parameters [16] or tensor-based models [8]. The Nelder-Mead algorithm has been used together with exponentially decaying centrifugal forces to improve the results at the cost of the number of iterations needed to converge [15], as well as with reinforcement techniques (ε-greedy) to select the best model at each iteration [25].…”
Section: Related Work (mentioning)
confidence: 99%
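
To make the contrast with the grid search mentioned in (ii) concrete, a short sketch follows; the grid values and the error surface are illustrative assumptions. Unlike Nelder-Mead, grid search exhaustively evaluates a fixed lattice of candidates rather than adaptively moving a simplex.

```python
import itertools

# Placeholder error surface; in practice this is an estimated
# validation error measured per hyperparameter combination.
def estimated_error(learning_rate, regularisation):
    return (learning_rate - 0.1) ** 2 + (regularisation - 0.01) ** 2

# Exhaustively evaluate every point on a fixed hyperparameter lattice
# and keep the combination with the lowest estimated error.
grid = itertools.product([0.01, 0.05, 0.1, 0.5], [0.001, 0.01, 0.1])
best = min(grid, key=lambda point: estimated_error(*point))
print(best)  # (0.1, 0.01) on this placeholder surface
```

The trade-off the excerpt hints at: grid search only ever finds the best lattice point, so its result is bounded by the grid's resolution, whereas direct-search methods refine their candidates between evaluations.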
“…Auto-tuning and input-aware techniques [13,16] have recently been used to address the problem of performance portability across different data-driven applications [11,23,32]. An interesting approach extends such techniques in the presence of multiple algorithmic choices [42]. However, their on-line solution is suitable when a specific routine is called multiple times.…”
Section: Related Work (mentioning)
confidence: 99%