2020
DOI: 10.1111/coin.12350
Toward classifying small lung nodules with hyperparameter optimization of convolutional neural networks

Abstract: Among all cancer‐related deaths, lung cancer leads all indicators, accounting for approximately 20% of all types. Patients diagnosed in the early stages have a 1‐year survival rate of 81% to 85%, while those diagnosed at an advanced stage have a 15% to 19% chance of survival. The primary manifestation of this cancer is pulmonary nodules on computed tomography images. Classifying these nodules as benign or malignant in the early stages is a complex task that presents challenges even for experienced specialists. S…

Cited by 17 publications (9 citation statements)
References 54 publications
“…Due to this, we consider hyper-parameter tuning the essential task of this research, and its main goal is to improve the baseline approach (with the initial ANN architecture and initial hyper-parameter values chosen by a human expert according to theoretical insights) by a significant margin. Examples of methods used for optimizing ANN hyper-parameters include various nature-inspired heuristics such as monarch butterfly optimization, swarm intelligence, Bayesian optimization (Cho et al, 2020), multi-threaded training (Połap et al, 2018), evolutionary optimization (Cui & Bai, 2019), genetic algorithms (Han et al, 2020), the harmony search algorithm (Kim, Geem & Han, 2020), simulated annealing (Lima, Ferreira Junior & Oliveira, 2020), Pareto optimization (Plonis et al, 2020), gradient descent optimization of a directed acyclic graph (Zhang et al, 2020), and others.…”
Section: Introduction
confidence: 99%
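All of the methods listed above treat hyperparameter tuning as black-box optimization of a validation metric. A minimal random-search sketch in Python illustrates the loop; the search space and the `evaluate` function are hypothetical stand-ins for actually training a network, not anything from the cited works:

```python
import random

# Hypothetical search space for a small CNN (names are illustrative).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
    "dropout": [0.2, 0.3, 0.5],
}

def evaluate(config):
    """Stand-in for training a model and returning validation accuracy.
    A toy analytic score is used here so the sketch runs on its own."""
    return (1.0 - config["learning_rate"]) * config["dropout"] / config["batch_size"]

def random_search(n_trials=20, seed=0):
    """Sample configurations uniformly at random and keep the best one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search()
```

Any of the heuristics named in the excerpt can replace the uniform sampling step while keeping the same evaluate-and-compare skeleton.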
“…In order to optimize hyperparameters in CNNs, various approaches have been proposed so far, such as an adaptive gradient optimizer [24], the Adam optimizer [25], Bayesian optimization [26], equilibrium optimization [27], evolutionary algorithms [28], genetic algorithms [29], grid search [30], particle swarm optimization [31, 32], random search [30, 33], simulated annealing [33], tree-of-Parzen estimators [33], the whale optimization algorithm [34], and weighted random search [35].…”
Section: Related Work
confidence: 99%
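Among the listed approaches, a genetic algorithm treats each hyperparameter configuration as an individual to be selected, recombined, and mutated across generations. The following is a minimal sketch, not the method of any cited reference; the search space and the `fitness` function are illustrative stand-ins for validation accuracy:

```python
import random

# Illustrative discrete search space for CNN hyperparameters.
SPACE = {
    "filters": [16, 32, 64],
    "kernel": [3, 5, 7],
    "dropout": [0.2, 0.4, 0.6],
}
KEYS = list(SPACE)

def fitness(cfg):
    # Stand-in for validation accuracy after training; a toy score so it runs.
    return cfg["filters"] / 64 + (7 - cfg["kernel"]) / 7 - abs(cfg["dropout"] - 0.4)

def genetic_search(pop_size=10, generations=15, mutation_rate=0.2, seed=1):
    rng = random.Random(seed)
    # Initial population: random configurations.
    pop = [{k: rng.choice(v) for k, v in SPACE.items()} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            # Uniform crossover: each gene comes from either parent.
            child = {k: (a if rng.random() < 0.5 else b)[k] for k in KEYS}
            if rng.random() < mutation_rate:      # point mutation
                k = rng.choice(KEYS)
                child[k] = rng.choice(SPACE[k])
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_search()
```

Because the top half of each generation survives unchanged, the best configuration found so far is never lost between generations.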
“…Lima [33] compared various hyperparameter optimization algorithms, such as random search, simulated annealing, and tree-of-Parzen estimators, in order to find the most effective CNN architecture for classifying benign and malignant small pulmonary nodules. Kumar and Hati [24] proposed the adaptive gradient optimizer–based deep convolutional neural network (ADG-dCNN) approach for detecting bearing and rotor faults in squirrel-cage induction motors.…”
Section: Related Work
confidence: 99%
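Simulated annealing, one of the algorithms compared in that work, perturbs a single configuration at a time and occasionally accepts worse moves to escape local optima, with the acceptance probability shrinking as a temperature parameter cools. A minimal sketch follows, under the same caveat that the search space and `score` function are illustrative stand-ins rather than the cited setup:

```python
import math
import random

# Illustrative discrete search space for CNN hyperparameters.
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
    "dropout": [0.2, 0.3, 0.5],
}

def score(cfg):
    # Stand-in for validation accuracy; a toy function so the sketch runs.
    return -abs(math.log10(cfg["learning_rate"]) + 3) - abs(cfg["dropout"] - 0.3)

def neighbour(cfg, rng):
    # Perturb one randomly chosen hyperparameter.
    new = dict(cfg)
    k = rng.choice(list(SPACE))
    new[k] = rng.choice(SPACE[k])
    return new

def simulated_annealing(steps=200, t0=1.0, cooling=0.97, seed=0):
    rng = random.Random(seed)
    current = {k: rng.choice(v) for k, v in SPACE.items()}
    best, t = current, t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = score(candidate) - score(current)
        # Always accept improvements; accept worse moves with prob exp(delta/t).
        if delta > 0 or rng.random() < math.exp(delta / t):
            current = candidate
        if score(current) > score(best):
            best = current
        t *= cooling          # geometric cooling schedule
    return best

best = simulated_annealing()
```

Early on, the high temperature makes the walk nearly random; as `t` decays, the search increasingly behaves like greedy hill climbing.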
“…Consequently, random hyperparameter search is applied most frequently and has proven to be effective and more efficient than grid search for DL [8]. Hyperparameter optimization is present in medical imaging research, where it has been used to obtain models with higher diagnostic performance for a range of problems [9, 10].…”
Section: Introduction
confidence: 99%
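The efficiency argument for random search over grid search can be made concrete: with the same trial budget, a grid reuses a few fixed values per hyperparameter many times, whereas independent random sampling tries a different value in nearly every trial, which matters when only a few hyperparameters dominate performance. A small illustrative comparison (the budget and dimensionality are arbitrary choices for this sketch):

```python
import itertools
import random

def grid_trials(values_per_dim, n_dims):
    """Full factorial grid over n_dims dimensions with the given values each."""
    return list(itertools.product(values_per_dim, repeat=n_dims))

def random_trials(n_trials, n_dims, seed=0):
    """Sample each dimension independently and uniformly from [0, 1)."""
    rng = random.Random(seed)
    return [tuple(rng.random() for _ in range(n_dims)) for _ in range(n_trials)]

# Same budget: a 3x3x3 grid (27 trials) vs 27 random configurations.
grid = grid_trials([0.0, 0.5, 1.0], 3)
rand = random_trials(27, 3)

# Distinct values tried for the first hyperparameter under each strategy:
grid_distinct = len({t[0] for t in grid})   # the grid reuses each of its 3 values 9 times
rand_distinct = len({t[0] for t in rand})   # random sampling yields a fresh value almost every trial
```

If the first hyperparameter turns out to be the important one, the random strategy has probed it at far more settings for the same cost.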