2021
DOI: 10.1016/j.eswa.2021.115525

EA-based hyperparameter optimization of hybrid deep learning models for effective drug-target interactions prediction

Cited by 32 publications (11 citation statements)
References 79 publications
“…We also selected the popular deep learning methods for comparison, including CNN [13] , DeepLSTM, CNN-AbiLSTM [14] , MPNN [15] , and transformer. With the same data set, we conducted pairwise experiments.…”
Section: Comparison With Deep Learning Methods (mentioning)
confidence: 99%
“…In addition to being a comprehensive searching algorithm, grid search aims to identify optimal hyperparameter values over a manually specified subset of the hyperparameter space [ 36 ]. However, since the grid of configurations grows exponentially with the number of hyperparameters, the algorithm is often impractical for optimizing deep neural networks [ 36 ]. During hyperparameter optimization of a CNN, evaluating a single hyperparameter selection may take a few hours or a whole day, which causes serious computational problems.…”
Section: Related Work (mentioning)
confidence: 99%
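The exponential growth described in the excerpt above can be made concrete with a small sketch. The grid below is purely illustrative (the hyperparameter names and values are assumptions, not taken from the cited paper); the point is that grid search must evaluate the full Cartesian product of the value lists.

```python
import itertools

# Hypothetical CNN hyperparameter grid (names and values are
# illustrative only, not from the cited paper).
grid = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64, 128],
    "num_filters": [32, 64, 128],
    "dropout": [0.2, 0.3, 0.5],
}

# Grid search evaluates every combination in the Cartesian product,
# so the number of full training runs multiplies with each added
# hyperparameter: 3 * 4 * 3 * 3 = 108 here.
configs = list(itertools.product(*grid.values()))
print(len(configs))  # → 108
```

Adding a single new hyperparameter with five candidate values would multiply the count to 540 runs, which is why the excerpt calls grid search impractical when each run takes hours.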
“…One of the main advantages of Bayesian optimization–based neural network optimization is that it does not require fully running the neural network. On the other hand, its complexity and the high-dimensional hyperparameter space make Bayesian optimization an impractical and expensive approach for hyperparameter optimization [ 36 ].…”
Section: Related Work (mentioning)
confidence: 99%
“…Thus, the proposed model outperformed these other advanced models in terms of error minimization, thereby improving wind energy forecasting accuracy. Additionally, Mahdaddi et al [22] presented an evolutionary algorithm (EA) framework, namely the differential evolution (DE) algorithm, for determining the ideal configuration of the proposed model's hyper-parameters. The suggested model, a CNN-attention-based bidirectional long short-term memory network (CNN-AbiLSTM), was compared against a manually built deep learning model to evaluate the EA-based hyper-parameter optimization technique.…”
Section: Introduction (mentioning)
confidence: 99%
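The DE algorithm referenced in the excerpt above can be sketched in a few lines. This is a generic DE/rand/1/bin loop over a toy quadratic objective standing in for validation loss; it is not the cited paper's implementation, and the bounds, population size, and control parameters (F, CR) are illustrative assumptions.

```python
import random

def objective(x):
    # Toy stand-in for validation loss over two continuous
    # hyperparameters; real DTI model training is far costlier.
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

def differential_evolution(obj, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=100, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    fitness = [obj(ind) for ind in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation: combine three distinct other members.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                      for d in range(dim)]
            # Binomial crossover with bound clipping; jrand guarantees
            # at least one dimension comes from the mutant.
            jrand = rng.randrange(dim)
            trial = []
            for d in range(dim):
                if rng.random() < CR or d == jrand:
                    lo, hi = bounds[d]
                    trial.append(min(max(mutant[d], lo), hi))
                else:
                    trial.append(pop[i][d])
            # Greedy selection: keep the trial if it is no worse.
            f_trial = obj(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = min(range(pop_size), key=fitness.__getitem__)
    return pop[best], fitness[best]

best_x, best_f = differential_evolution(objective, [(0.0, 1.0), (0.0, 1.0)])
print(best_x, best_f)
```

In a real hyperparameter search, `objective` would train the CNN-AbiLSTM on the chosen configuration and return its validation error, making each fitness evaluation expensive; DE's appeal is that it searches the continuous space without gradients or an explicit grid.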