2019 IEEE 27th International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS)
DOI: 10.1109/mascots.2019.00045
Practical Design Space Exploration

Abstract: Multi-objective optimization is a crucial matter in computer systems design space exploration because real-world applications often rely on a trade-off between several objectives. Derivatives are usually not available or impractical to compute, and the feasibility of an experiment cannot always be determined in advance. These problems are particularly difficult when the feasible region is relatively small, and it may be prohibitive to even find a feasible experiment, let alone an optimal one. We introduce a new…

Cited by 62 publications (56 citation statements). References 28 publications.
“…Hundreds of metrics are reportedly used for a single experiment at Microsoft (Kevic et al. 2017; Machmouchi and Buscher 2016). For optimization, multi-objective optimization (Nardi et al. 2019; Sun et al. 2018) can be applied offline at compile time, where someone can manually make a trade-off between metrics from a Pareto front. However, online bandit optimization algorithms require a single metric to serve as the reward (though that metric can be a scalar index).…”
Section: Considerations For What Metric and Changes To Optimize For (citation type: mentioning)
confidence: 99%
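To make the quote's distinction concrete: offline, one can enumerate a Pareto front and pick a trade-off by hand; online, a bandit needs the objectives collapsed into one scalar reward. The sketch below is illustrative only and is not from any cited paper; the metric values, weights, and scales are hypothetical assumptions.

```python
import numpy as np

def pareto_front(points):
    """Return the points not dominated by any other point (lower is better)."""
    points = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(points):
        # p is dominated if some point is no worse everywhere and strictly better somewhere
        dominated = np.any(np.all(points <= p, axis=1) &
                           np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return points[keep]

# Offline: inspect the front and let a human pick the trade-off.
metrics = np.array([(0.12, 900.0),   # hypothetical (error rate, latency ms)
                    (0.10, 1400.0),
                    (0.15, 700.0)])
print(pareto_front(metrics))         # all three are mutually non-dominated here

# Online: a bandit needs one scalar reward per pull, e.g. a weighted sum
# of normalized metrics (weights and scales are invented assumptions).
weights = np.array([0.7, 0.3])
scales = np.array([1.0, 1000.0])
reward = -float(np.dot(weights, metrics[0] / scales))  # lower cost -> higher reward
```

Linear scalarization is the simplest such index; any fixed weighting bakes the trade-off in up front, which is exactly the choice that offline Pareto-front inspection defers to a human.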
“…Next, we train a linear-kernel SVM, applying the binary relevance method to obtain multi-class, multi-label classifications. In this process, we also optimize the SVM regularization hyperparameter (C) through 35 iterations of Bayesian optimization [6].…”
Section: SVM-Based Classification Approach (citation type: unclassified)
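The citing work tunes only C with 35 iterations of Bayesian optimization via [6]. As a rough sketch of that setup, the snippet below substitutes scikit-optimize's gp_minimize for the optimizer in [6] and synthetic data for the authors' corpus; both substitutions are assumptions, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier  # binary relevance
from sklearn.svm import SVC
from skopt import gp_minimize
from skopt.space import Real

# Synthetic stand-in for the authors' multi-label dataset.
X, y = make_multilabel_classification(n_samples=300, n_classes=5,
                                      random_state=0)

def objective(params):
    (C,) = params
    clf = OneVsRestClassifier(SVC(kernel="linear", C=C))
    # Negate: gp_minimize minimizes, but we want to maximize micro-F1.
    return -cross_val_score(clf, X, y, cv=3, scoring="f1_micro").mean()

result = gp_minimize(objective,
                     [Real(1e-3, 1e3, prior="log-uniform", name="C")],
                     n_calls=35, random_state=0)  # 35 iterations, as in the quote
print("best C:", result.x[0], "best micro-F1:", -result.fun)
```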
“…In these approaches, words are represented as embeddings. We also optimize the main hyperparameters of these networks through 100 iterations of Bayesian optimization [6]. We then use the best hyperparameters to train and test the final model.…”
Section: Neural Network-Based Approaches (citation type: unclassified)
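For the neural approaches, the search space mixes continuous, integer, and categorical hyperparameters. A companion sketch under the same assumptions as above: scikit-optimize standing in for [6], a dummy objective standing in for the real training run, and invented parameter names and ranges.

```python
from skopt import gp_minimize
from skopt.space import Real, Integer, Categorical

# Invented search space; the cited paper does not list its exact ranges.
space = [Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
         Integer(50, 300, name="embedding_dim"),
         Integer(32, 512, name="hidden_units"),
         Categorical(["adam", "sgd"], name="optimizer")]

def train_and_score(params):
    lr, emb_dim, hidden, opt = params
    # Placeholder: train the embedding-based network with these
    # hyperparameters and return a loss to minimize (e.g. 1 - F1).
    return (lr - 0.01) ** 2 + abs(emb_dim - 100) / 1000  # dummy surface

result = gp_minimize(train_and_score, space,
                     n_calls=100, random_state=0)  # 100 iterations, as quoted
print("best hyperparameters:", result.x)
```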
“…HyperMapper 2.0 [60], [61], [62] is designed for heterogeneous workloads and can handle complex design spaces with multiple objectives, categorical/ordinal variables, and unknown feasibility constraints [63], while exploiting performance profiles from earlier workload executions in Polystore++.
Section: Optimization Challenges (citation type: mentioning)
confidence: 99%
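Of the capabilities listed, unknown feasibility constraints are the least standard: whether a configuration is even valid is discovered only by running it. The sketch below is not HyperMapper's actual API; it illustrates one generic way to handle this, learning a feasibility classifier from past runs and filtering candidates through it. The parameter names, hidden constraint, and black box are all invented.

```python
import random
from sklearn.ensemble import RandomForestClassifier

CACHE_KB = [256, 512, 1024, 2048]          # ordinal parameter (assumed)
PREFETCHER = ["none", "stride", "stream"]  # categorical parameter (assumed)

def sample():
    # Encode the categorical choice as an index for the classifier.
    return [random.choice(CACHE_KB), random.randrange(len(PREFETCHER))]

def evaluate(cfg):
    """Placeholder black box: returns (feasible, (latency, energy))."""
    cache, pre = cfg
    feasible = cache + pre * 100 < 2200    # hidden constraint, invented
    return feasible, (1e6 / cache + pre, cache * 0.01)

X, ok, results = [], [], []
clf = RandomForestClassifier(n_estimators=50, random_state=0)
for i in range(60):
    cfg = sample()
    # After a warm-up phase, skip candidates predicted infeasible.
    if i >= 20 and clf.predict([cfg])[0] == 0:
        continue
    feasible, objs = evaluate(cfg)
    X.append(cfg)
    ok.append(int(feasible))
    if feasible:
        results.append((cfg, objs))
    if i >= 19:
        clf.fit(X, ok)  # refit the feasibility model as data accrues
```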
“…Moreover, building cost models with a large number of parameters may be expensive. Alternatively, active learning [61], [62], [60] can trade off exploration and exploitation to arrive at an approximately optimal configuration for workload execution in Polystore++ systems.…”
Section: Optimization Challenges (citation type: mentioning)
confidence: 99%
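A minimal sketch of that exploration/exploitation trade-off, not Polystore++ code: a random-forest surrogate predicts cost, the disagreement among its trees serves as an uncertainty estimate, and a lower-confidence-bound rule chooses the next configuration to run. The workload function and candidate space are invented placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def run_workload(cfg):
    """Placeholder black box standing in for a real workload execution."""
    return (cfg[0] - 3.0) ** 2 + 0.5 * cfg[1]

rng = np.random.default_rng(0)
candidates = rng.uniform(0, 6, size=(200, 2))      # invented config space
tried = list(rng.choice(len(candidates), size=5, replace=False))
costs = [run_workload(candidates[i]) for i in tried]

model = RandomForestRegressor(n_estimators=50, random_state=0)
for _ in range(25):                                 # evaluation budget
    model.fit(candidates[tried], costs)
    per_tree = np.stack([t.predict(candidates) for t in model.estimators_])
    mean, std = per_tree.mean(axis=0), per_tree.std(axis=0)
    lcb = mean - std        # low mean = exploit, high std = explore
    lcb[tried] = np.inf     # never re-run a configuration
    nxt = int(np.argmin(lcb))
    tried.append(nxt)
    costs.append(run_workload(candidates[nxt]))

best_cost, best_idx = min(zip(costs, tried))
print("approximately optimal config:", candidates[best_idx], best_cost)
```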