2020
DOI: 10.1016/j.phycom.2020.101057
Novel suboptimal approaches for hyperparameter tuning of deep neural network [under the shelf of optical communication]

Cited by 47 publications (16 citation statements).
References 36 publications (30 reference statements).
“…The purpose of a deep learning LSTM-autoencoder network is to gather and extract composite information from large time-series datasets using many hidden layers [47]. However, choosing suitable hyperparameters is a complex task that significantly affects the model’s performance [48]. For example, adding more hidden layers or more neurons to a network does not necessarily improve its performance.…”
Section: Literature Review (mentioning)
confidence: 99%
“…However, they have high computational complexity, especially with a wide range of parameter values that need to be tuned. In this regard, other researchers have explored alternative techniques, such as the suboptimal grid search [64]. In this paper, we used the grid search technique for hyperparameter tuning for LSTM and SVR, since it was necessary to optimize a large number of models with different behaviour.…”
Section: Hyperparameter Tuning (mentioning)
confidence: 99%
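The exhaustive grid search described in that statement can be sketched in a few lines. The hyperparameter names, candidate values, and the `evaluate` scoring stub below are illustrative assumptions standing in for actually training the LSTM/SVR models, not the cited authors' setup:

```python
import itertools

# Hypothetical hyperparameter grid for an LSTM-style model (illustrative values).
grid = {
    "hidden_units": [32, 64, 128],
    "learning_rate": [1e-3, 1e-2],
    "dropout": [0.0, 0.2],
}

def evaluate(params):
    """Stand-in for training a model and returning a validation score.
    A real implementation would fit and score the model here; this stub
    simply prefers mid-sized settings so the sketch runs without any ML
    library."""
    return (-abs(params["hidden_units"] - 64)
            - 100 * params["learning_rate"]
            - params["dropout"])

def grid_search(grid):
    keys = list(grid)
    best_params, best_score = None, float("-inf")
    # Enumerate every combination exhaustively -- the source of grid
    # search's high computational cost as the number of values grows.
    for values in itertools.product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = grid_search(grid)
```

Because the number of trained models is the product of all grid sizes (here 3 × 2 × 2 = 12), the cost grows multiplicatively with each added hyperparameter, which is exactly the complexity concern the quoted passage raises.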
“…The grid search methodology has been used to determine the internal structure of both denoising approaches, i.e., their hyperparameters. It is regarded as one of the most effective methods for determining ANN hyperparameters by testing different net configurations [58,59]. Here, it has been used to determine the number of hidden layers and hidden neurons per layer of the two denoising approaches.…”
Section: Denoising Architectures (mentioning)
confidence: 99%
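A minimal sketch of that layer/neuron grid search follows. The candidate ranges and the `validation_loss` proxy are assumptions: a real run would train each denoising network configuration and measure its validation loss, while the synthetic bowl-shaped surface here only makes the sketch self-contained:

```python
import itertools

LAYER_CHOICES = [1, 2, 3]           # hidden-layer counts to try (assumed range)
NEURON_CHOICES = [16, 32, 64, 128]  # neurons per hidden layer (assumed range)

def validation_loss(n_layers, n_neurons):
    """Proxy for training a denoising ANN with this configuration and
    measuring its validation loss; a synthetic quadratic surface stands
    in for the actual training step."""
    return (n_layers - 2) ** 2 + ((n_neurons - 32) / 32) ** 2

# Test every net configuration and keep the one with the lowest loss,
# mirroring the exhaustive search over architectures described above.
best_cfg = min(itertools.product(LAYER_CHOICES, NEURON_CHOICES),
               key=lambda cfg: validation_loss(*cfg))
```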