2019
DOI: 10.1007/978-3-030-13001-5_12

Bayesian Optimization of Neural Architectures for Human Activity Recognition

Cited by 5 publications (5 citation statements)
References: 15 publications
Citation types: 0 supporting, 5 mentioning, 0 contrasting

“…To compare our baseline with the proposed hierarchical model, we make sure to get the same complexity, i.e., a comparable number of parameters as the largest hierarchies, including the weights of the learners. We also use Bayesian optimization based on Gaussian processes as surrogate models to select the optimal hyperparameters of the baseline model (Snoek, Larochelle, and Adams 2012; Osmani and Hamidi 2019). In addition, we compare our proposed approach with the following closely related baselines from the HAR literature:…”
Section: Methods (mentioning)
confidence: 99%
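
The statement above names the core technique of the cited paper: Bayesian optimization with a Gaussian process as the surrogate model. The following is a minimal sketch of that loop, using scikit-learn's GP regressor and an expected-improvement acquisition; the one-dimensional objective f is a stand-in for the expensive measurement (training the baseline model and returning its validation error), and all bounds and iteration counts are illustrative assumptions, not values from the cited papers.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    # Stand-in for an expensive evaluation, e.g. "train the baseline model
    # with this hyperparameter value and return its validation error".
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))            # a few initial evaluations
y = np.array([f(v[0]) for v in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(15):
    gp.fit(X, y)                               # GP surrogate of the objective
    cand = np.linspace(-2, 2, 500).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = cand[np.argmax(ei)]               # most promising point to try next
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print("best x:", X[y.argmin(), 0], "best value:", y.min())
```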
“…For instance, in Nguyen, Liu, and Zio (2020), an accurate and reliable multi-step-ahead prediction model based on LSTM, whose hyperparameters have been optimized through BO, has been validated on steam generator data acquired from different French nuclear power plants for prognostics and health management of the plants. In Osmani and Hamidi (2019), the expensive and time-consuming design of a deep neural network for human activity recognition has been addressed via BO in order to optimally and efficiently tune the deep neural architectures' hyperparameters. With respect to ADS, a recent and interesting application of BO is devoted to generating simulation scenarios in order to improve the accuracy and "safety" of the ADS (Abeysirigoonawardena, Shkurti, and Dudek, 2019; Zerwas, Kalmbach, Henkel, Rétvári, Kellerer, Blenk, and Schmid, 2019; Gangopadhyay, Khastgir, Dey, Dasgupta, Montana, and Jennings, 2019).…”
Section: Related Work (mentioning)
confidence: 99%
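
Where the first sketch spells out the surrogate/acquisition loop by hand, in practice an off-the-shelf BO routine is usually wrapped around the training run, as this statement describes. Below is a minimal sketch of that usage with scikit-optimize's gp_minimize; the search space and the analytic stand-in objective are assumptions for illustration, since the actual hyperparameters tuned in Osmani and Hamidi (2019) are not listed in this excerpt.

```python
from skopt import gp_minimize
from skopt.space import Integer, Real
from skopt.utils import use_named_args

# Hypothetical architectural search space; the real hyperparameters and
# ranges used in the cited work are not given here.
space = [
    Integer(1, 4, name="n_conv_layers"),
    Integer(16, 128, name="n_filters"),
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
]

@use_named_args(space)
def objective(n_conv_layers, n_filters, learning_rate):
    # Stand-in for an expensive training run that returns validation error;
    # a real objective would build, train, and evaluate the network here.
    return (n_conv_layers - 2) ** 2 + (n_filters - 64) ** 2 / 1e3 + abs(learning_rate - 1e-2)

result = gp_minimize(objective, space, n_calls=20, random_state=0)
print("best hyperparameters:", result.x, "lowest error:", result.fun)
```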
“…Hyperparameters of the architectural components and their associated ranges used for the construction of the surrogate model. From [109].…”
mentioning
confidence: 99%
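
The table this statement refers to is not reproduced in the excerpt, so the sketch below only illustrates the idea: each architectural hyperparameter gets a named range, and an initial design for the surrogate model is drawn from those ranges. The names and bounds are hypothetical, not those tabulated in [109].

```python
import random

# Hypothetical hyperparameters of the architectural components with their
# ranges; the real table is in [109] and is not reproduced in this excerpt.
ranges = {
    "n_conv_blocks": (1, 4),      # integer range
    "kernel_size":   (3, 11),     # integer range
    "n_lstm_units":  (32, 256),   # integer range
    "dropout_rate":  (0.0, 0.5),  # continuous range
}

def sample_configuration(rng):
    """Draw one architecture configuration uniformly from the ranges."""
    cfg = {}
    for name, (lo, hi) in ranges.items():
        if isinstance(lo, int):
            cfg[name] = rng.randint(lo, hi)   # inclusive integer draw
        else:
            cfg[name] = rng.uniform(lo, hi)
    return cfg

rng = random.Random(0)
# The evaluated (configuration, validation error) pairs of such an initial
# design are what the Gaussian-process surrogate is first fitted on.
initial_design = [sample_configuration(rng) for _ in range(5)]
for cfg in initial_design:
    print(cfg)
```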
“…Architectures with recurrent output layers are referred to as Hybrid, while those with dense output layers are referred to as Convolutional. From [109].…”
mentioning
confidence: 99%
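
To make the Hybrid/Convolutional distinction concrete, here is a minimal PyTorch sketch, assuming the reading above: both variants share a 1-D convolutional feature extractor over a window of sensor readings and differ only in the output block, recurrent (LSTM) for Hybrid versus dense for Convolutional. All layer sizes are illustrative, not those of [109].

```python
import torch
import torch.nn as nn

class HARNet(nn.Module):
    def __init__(self, n_channels=6, n_classes=8, hybrid=True):
        super().__init__()
        # Shared convolutional feature extractor over the sensor window.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.hybrid = hybrid
        if hybrid:
            # "Hybrid": recurrent output block over the conv feature sequence.
            self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
            self.out = nn.Linear(64, n_classes)
        else:
            # "Convolutional": global pooling followed by a dense output layer.
            self.pool = nn.AdaptiveAvgPool1d(1)
            self.out = nn.Linear(64, n_classes)

    def forward(self, x):                        # x: (batch, channels, time)
        h = self.features(x)                     # (batch, 64, time)
        if self.hybrid:
            h, _ = self.lstm(h.transpose(1, 2))  # (batch, time, 64)
            return self.out(h[:, -1])            # classify from last time step
        return self.out(self.pool(h).squeeze(-1))

x = torch.randn(4, 6, 128)   # 4 windows, 6 sensor channels, 128 samples each
print(HARNet(hybrid=True)(x).shape, HARNet(hybrid=False)(x).shape)
```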