2016 IEEE Spoken Language Technology Workshop (SLT)
DOI: 10.1109/slt.2016.7846296

Optimizing neural network hyperparameters with Gaussian processes for dialog act classification

Abstract: Systems based on artificial neural networks (ANNs) have achieved state-of-the-art results in many natural language processing tasks. Although ANNs do not require manually engineered features, they have many hyperparameters that must be optimized, and the choice of hyperparameters significantly impacts model performance. However, ANN hyperparameters are typically chosen by manual, grid, or random search, which either requires expert experience or is computationally expensive. Recent approaches based on Bayesian op…

Cited by 23 publications (9 citation statements)
References 20 publications
“…More specifically, we optimize the hyperparameters listed in Table 1 [37] with an implementation of Gaussian process-based Bayesian optimization provided by the GPyOpt Python library version 1.2.1 [38]. Bayesian optimization constructs a probabilistic model of the function mapping from hyperparameter settings to the model performance and provides a systematic way to explore the space more efficiently [39].…”
Section: Parameters Setting
confidence: 99%
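The excerpt above describes tuning hyperparameters with the GPyOpt library's Gaussian-process Bayesian optimization. The following is a minimal sketch of that workflow; the search space, the placeholder objective, and the iteration budget are illustrative assumptions, not the settings from the cited paper (which optimizes the hyperparameters listed in its Table 1).

```python
# Minimal GPyOpt sketch (assumed search space and objective).
import numpy as np
import GPyOpt

# Hyperparameter domain in GPyOpt's format: one dict per variable.
domain = [
    {'name': 'learning_rate', 'type': 'continuous', 'domain': (1e-4, 1e-1)},
    {'name': 'dropout',       'type': 'continuous', 'domain': (0.0, 0.7)},
    {'name': 'hidden_units',  'type': 'discrete',   'domain': (64, 128, 256, 512)},
]

def objective(x):
    """Map a batch of configurations (rows of x) to validation error.

    In a real setup this would train the dialog-act classifier and return
    1 - validation accuracy; here a smooth placeholder stands in so the
    sketch runs end to end.
    """
    lr, dropout, units = x[0]
    placeholder_error = (np.log10(lr) + 2.5) ** 2 + (dropout - 0.3) ** 2 + 1e-4 * units
    return np.array([[placeholder_error]])

opt = GPyOpt.methods.BayesianOptimization(
    f=objective,
    domain=domain,
    acquisition_type='EI',      # expected improvement acquisition function
    initial_design_numdata=5,   # random configurations evaluated before the GP takes over
)
opt.run_optimization(max_iter=20)

print('best configuration:', opt.x_opt)
print('best objective value:', opt.fx_opt)
```

The GP surrogate models the mapping from hyperparameter settings to validation performance, and the acquisition function selects the next configuration to evaluate, which is why this explores the space more efficiently than exhaustive search.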
“…HPO is the process of optimising a loss function over a graph-structured configuration space [4]; the aim is to maximise or minimise a given function [39]. Determining appropriate values for the hyperparameters is fundamental to finding an optimal solution; however, it is a frustratingly difficult task [18,10,8,9], and the performance of NNs crucially depends on the hyperparameters used [9,6]. For example, the internal structure is a key factor in determining the efficiency of the NN [27].…”
Section: Hyper-parameter Optimisation
confidence: 99%
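As a counterpoint to the GP-based search sketched above, the excerpt's "loss function over a configuration space" can be made concrete with a hypothetical validation-error function for a small scikit-learn MLP on synthetic data, evaluated exhaustively over an illustrative grid; none of these names or values come from the cited works.

```python
# Sketch of the objective that HPO minimises: validation error as a function of
# hyperparameters, here searched naively over a small grid for comparison.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def validation_error(hidden_units, learning_rate, alpha):
    """Map one hyperparameter configuration to the quantity HPO minimises."""
    clf = MLPClassifier(hidden_layer_sizes=(hidden_units,),
                        learning_rate_init=learning_rate,
                        alpha=alpha, max_iter=300, random_state=0)
    clf.fit(X_tr, y_tr)
    return 1.0 - clf.score(X_val, y_val)

# Illustrative grid over the same kind of configuration space; a GP-based
# optimizer would instead choose these points adaptively.
configs = [(h, lr, a) for h in (32, 128) for lr in (1e-3, 1e-2) for a in (1e-4, 1e-2)]
best = min(configs, key=lambda c: validation_error(*c))
print('best grid configuration (hidden_units, learning_rate, alpha):', best)
```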
“…There has been a substantial amount of research dedicated to urban water demand management, which is particularly essential in developing countries as they often suffer from high rates of urbanisation. The majority of existing research on urban water demand management has focused on the residential sector, for example, demand forecasting (Adamowski et al, 2012; Bougadis et al, 2005; Donkor et al, 2012; Ghiassi et al, 2017; Ren and Li, 2016), demand modelling (Gurung et al, 2014; Jacobs and Haarhoff, 2004), general demand management (Kenney et al, 2008), and water usage management interventions (Datta et al, 2015; Dernoncourt and Lee, 2016; Fielding et al, 2012). There is limited research on water demand in the non-residential or educational sectors, despite the fact that these sectors can be high water consumers (Sánchez-Torija et al, 2017).…”
Section: Introduction
confidence: 99%