2017 14th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS)
DOI: 10.1109/avss.2017.8078499
Hyper-Optimization tools comparison for parameter tuning applications

Abstract: This paper evaluates and compares different hyperparameter optimization tools that can be used in vision applications to tune their underlying free parameters. We focus on the problem of multiple object tracking, as it is widely studied in the literature and offers several parameters to tune. The selected tools are freely available or easy to implement. In this paper we evaluate the impact of parameter optimization tools on tracking performance using videos from public datasets. We also discuss…

Cited by 10 publications (6 citation statements)
References 11 publications
“…Thanks to the efforts of recent studies, today's database practitioners are well-equipped with many techniques of configuration tuning. However, when we look from a broader view, we can find plenty of toolkits and algorithms designed for black-box optimization, especially hyper-parameter optimization (HPO) approaches [6,30,49,60,82,89]. HPO aims to find the optimal hyper-parameter configurations of a machine learning algorithm as rapidly as possible to minimize the corresponding loss function.…”
Section: Motivation
confidence: 99%
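The statement above frames HPO as minimizing a loss over hyper-parameter configurations. A minimal, hedged sketch of that black-box loop, using plain random search over a box-constrained space (the toy loss and parameter names are illustrative assumptions, not from the cited papers):

```python
import random

def random_search(loss, space, n_trials=50, seed=0):
    """Minimal black-box HPO: sample configurations uniformly from
    `space` (name -> (low, high)) and keep the lowest-loss one."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        current = loss(cfg)
        if current < best_loss:
            best_cfg, best_loss = cfg, current
    return best_cfg, best_loss

# Toy quadratic loss with its minimum at x=2, y=-1 (illustrative only).
toy_loss = lambda c: (c["x"] - 2) ** 2 + (c["y"] + 1) ** 2
cfg, val = random_search(toy_loss, {"x": (-5, 5), "y": (-5, 5)}, n_trials=200)
```

Tools such as SMAC or TPE replace the uniform sampler with a surrogate model that proposes promising configurations, but the interface — a loss function evaluated per configuration — is the same.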
“…Instead, we use a hyper-parameter optimization method: SMAC [7], which has up to 900 citations. It has already proven to increase the performance of computer vision algorithms by tuning hyper-parameters [4,19]. In order to find the set of parameter values, it requires a measure of the algorithm's performance.…”
Section: Methods
confidence: 99%
“…For hyper-parameter optimization, the paper [19] compared TPE (Tree-structured Parzen Estimator), MCMC (Markov chain Monte Carlo) and SMAC (Sequential Model-based Algorithm Configuration) optimisation techniques and found TPE to be around 4% more accurate than the other optimisers. There are few papers comparing different hyper-parameter optimisation algorithms.…”
Section: Max Flames = Round (N-i*(n-1)/t) (1)
confidence: 99%