2016 IEEE Long Island Systems, Applications and Technology Conference (LISAT)
DOI: 10.1109/lisat.2016.7494142
Analysis of instance selection algorithms on large datasets with Deep Convolutional Neural Networks

Cited by 9 publications (10 citation statements)
References 14 publications
“…Many instance selection algorithms have been proposed and reviewed in [59]. Albelwi and Mahmood [60] evaluated and analyzed the performance of different instance selection algorithms on CNNs. In this framework, for very large datasets, we employ instance selection based on Random Mutation Hill Climbing (RMHC) [61] as a preprocessing step to select the training sample (TS), which will be used during the exploration phase to find a high-performing architecture.…”
Section: Reducing the Training Set
confidence: 99%
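The RMHC step quoted above follows Skalak's random mutation hill climbing: keep a fixed-size subset of the training set, randomly swap one member for an instance outside the subset, and retain the swap only if a fitness score does not degrade. Below is a minimal Python sketch under those assumptions; the `evaluate` callback, parameter names, and iteration budget are illustrative, not code from the cited framework.

```python
import random

def rmhc_instance_selection(n_instances, subset_size, evaluate,
                            iterations=1000, seed=0):
    """Random Mutation Hill Climbing for instance selection (after Skalak, 1994).

    `evaluate(indices)` is a caller-supplied scorer (hypothetical here) that
    returns the fitness of the subset given by `indices` -- e.g. validation
    accuracy of a small model trained on those instances. Assumes
    subset_size < n_instances.
    """
    rng = random.Random(seed)
    selected = rng.sample(range(n_instances), subset_size)   # current subset
    chosen = set(selected)
    remaining = [i for i in range(n_instances) if i not in chosen]
    best_score = evaluate(selected)

    for _ in range(iterations):
        # Mutate: swap one selected instance with one unselected instance.
        s_pos = rng.randrange(subset_size)
        r_pos = rng.randrange(len(remaining))
        selected[s_pos], remaining[r_pos] = remaining[r_pos], selected[s_pos]
        score = evaluate(selected)
        if score >= best_score:
            best_score = score    # keep the mutation (ties accepted)
        else:
            # Revert the swap if fitness degraded.
            selected[s_pos], remaining[r_pos] = remaining[r_pos], selected[s_pos]
    return selected, best_score
```

In practice `evaluate` would be a cheap proxy such as 1-NN accuracy on a held-out split, since retraining a CNN at every mutation would be prohibitively expensive.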
“…Not only must a hyperparameter optimization algorithm optimize over variables which are discrete, ordinal, and continuous, but it must simultaneously choose which variables to optimize, a difficult task. Currently, no work covers optimization of every hyperparameter in designing a CNN architecture [1]. In the following, we briefly refer to some recent CNN hyperparameter optimization results.…”
Section: Related Work: Methods for CNN Hyperparameter Optimization
confidence: 99%
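To make the mixed-variable difficulty concrete: a CNN search space mixes categorical, ordinal, and continuous hyperparameters, and some variables (per-layer settings) exist only conditionally on others (the depth). A sketch of sampling such a space follows; all ranges and names are illustrative assumptions, not values from the cited work.

```python
import random

def sample_cnn_config(rng=random):
    """Randomly sample one CNN configuration from a mixed, conditional space."""
    n_conv = rng.randint(2, 6)                        # ordinal: network depth
    config = {
        "n_conv_layers": n_conv,
        "learning_rate": 10 ** rng.uniform(-4, -1),   # continuous (log scale)
        "optimizer": rng.choice(["sgd", "adam"]),     # discrete / categorical
    }
    # Conditional variables: each layer's settings exist only if the layer does.
    config["filters"] = [rng.choice([16, 32, 64, 128]) for _ in range(n_conv)]
    config["kernel_sizes"] = [rng.choice([3, 5, 7]) for _ in range(n_conv)]
    return config
```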
“…The NMA was used in Convolutional Neural Network (CNN) optimization in [1,2], in conjunction with a relatively small optimization dataset. It works well for objective functions that are smooth, unimodal and not too noisy.…”
Section: Derivative-Free Optimization: Nelder-Mead
confidence: 99%
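As a concrete illustration of the NMA on the kind of smooth, unimodal surface it handles well, the sketch below uses SciPy's standard Nelder-Mead implementation; `validation_error` is a hypothetical quadratic stand-in for a real train-and-evaluate routine over continuous hyperparameters.

```python
import numpy as np
from scipy.optimize import minimize

def validation_error(theta):
    """Stand-in objective: log learning rate and log weight decay -> error.

    A real objective would train a CNN with these settings and return its
    validation error; this quadratic mimics a smooth, unimodal surface.
    """
    log_lr, log_wd = theta
    return (log_lr + 3.0) ** 2 + 0.5 * (log_wd + 4.0) ** 2

result = minimize(validation_error, x0=np.array([-2.0, -2.0]),
                  method="Nelder-Mead",
                  options={"xatol": 1e-3, "fatol": 1e-3, "maxiter": 200})
print(result.x)  # converges to approximately [-3, -4], the minimum
```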
“…This may also reduce the search space (relates to A2). For instance, superior results were obtained by combining accuracy with visualization via a deconvolution network [1].…”
Section: Computational Complexity Issues
confidence: 99%
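The combination of accuracy with a visualization-derived term can be pictured as a weighted objective. The sketch below assumes a simple linear form with a hypothetical trade-off weight `beta`; the exact formulation in [1] may differ.

```python
def combined_objective(error_rate, vis_score, beta=0.5):
    """Weighted combination of classification error and a visualization term.

    `vis_score` is assumed to quantify how well deconvolution-network feature
    maps reconstruct the input (lower is better); `beta` is an assumed
    trade-off weight, both hypothetical. Lower objective values are better.
    """
    return beta * error_rate + (1.0 - beta) * vis_score
```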