2020
DOI: 10.1007/s40435-020-00708-w

Automatic model selection for fully connected neural networks

Cited by 24 publications (14 citation statements)
References 29 publications
“…In our work, we chose the CNN algorithm as the model for classifying pre- and post-learning SWRs. Indeed, the CNN performed much better, with an average accuracy more than 10% higher than that of other deep learning algorithms such as fully connected networks (28, 29) or recurrent neural networks (30–32).…”
Section: Methods (mentioning)
confidence: 99%
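To make the contrast in this statement concrete, here is a minimal sketch (not the cited study's code) of the two model families being compared: a fully connected baseline versus a 1-D CNN for fixed-length SWR windows. It assumes PyTorch; the class names, layer sizes, and window length are illustrative assumptions.

```python
# Illustrative sketch: fully connected vs. 1-D CNN classifiers for
# fixed-length SWR windows. Sizes and names are hypothetical.
import torch
import torch.nn as nn

class FullyConnectedBaseline(nn.Module):
    """Flattens the window and classifies with dense layers."""
    def __init__(self, window_len=256, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(window_len, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, window_len)
        return self.net(x)

class SWRConvNet(nn.Module):
    """1-D CNN: convolutions exploit local temporal structure."""
    def __init__(self, window_len=256, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.head = nn.Linear(32 * (window_len // 16), n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

x = torch.randn(8, 1, 256)  # a dummy batch of SWR windows
print(FullyConnectedBaseline()(x).shape, SWRConvNet()(x).shape)
```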
“…For both fixed and nonfixed topological patterns, the architectural units can be macrounits, more commonly known as cells in existing works. In cell-based solution representations [15], [53], [64]–[67], [67]–[69], [71], [79], [159], the encodings of connections among cells follow the approaches described for fixed and nonfixed topological patterns. The cell itself can be treated as a "small" DNN model, which can likewise be represented following either a fixed or a nonfixed topological pattern.…”
Section: B. Taxonomy and Survey of Existing EA-Based Approaches (mentioning)
confidence: 99%
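The cell-based representation described in this excerpt can be sketched briefly. This is not any surveyed encoding in particular: here a cell is a small DAG whose genes pick, for each node, an input node and an operation, and the outer genome simply stacks copies of the cell. The operation set and genome fields are assumptions.

```python
# Illustrative cell-based solution representation: each gene is
# (input_node, operation); the stacked-cell outer pattern is fixed.
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def random_cell(n_nodes=4):
    """Each new node picks one earlier node as input and one operation."""
    return [(random.randrange(i + 1), random.choice(OPS))
            for i in range(n_nodes)]

def decode(cell):
    """Render the cell's DAG as readable pseudo-operations."""
    lines = ["node0 = cell_input"]
    for i, (src, op) in enumerate(cell, start=1):
        lines.append(f"node{i} = {op}(node{src})")
    return "\n".join(lines)

genome = {"cell": random_cell(), "num_stacked_cells": 3}
print(decode(genome["cell"]))
```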
“…For example, in [131], ES is applied to choose the learner between Adam and Adadelta and to optimize the parameters of the chosen learner. In some works, optimization of the learner's parameters is integrated with the model architecture optimization process [51], [53], [55], [127]–[130], [133], [134], [168], [169]. For example, in [127], an EA is applied to design a VGG model, where the parameters of a prespecified model parameter learner are encoded together with the model architecture in the solution representation.…”
Section: B. Taxonomy and Survey (mentioning)
confidence: 99%
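A hedged sketch of the joint-encoding idea this excerpt describes: the genome carries the architecture together with the choice of learner (Adam vs. Adadelta) and that learner's parameters, so mutation can act on either part. The fields, value ranges, and mutation scheme below are illustrative, not the cited papers' encodings.

```python
# Illustrative genome that jointly encodes architecture + learner choice
# + learner parameters; mutation touches one part at a time.
import random

def random_genome():
    learner = random.choice(["adam", "adadelta"])
    params = {"adam": {"lr": 10 ** random.uniform(-4, -2)},
              "adadelta": {"rho": random.uniform(0.8, 0.99)}}[learner]
    return {"layers": [random.choice([64, 128, 256])
                       for _ in range(random.randint(2, 5))],
            "learner": learner,
            "learner_params": params}

def mutate(genome):
    child = {**genome, "layers": list(genome["layers"])}
    if random.random() < 0.5:   # mutate the architecture part
        i = random.randrange(len(child["layers"]))
        child["layers"][i] = random.choice([64, 128, 256])
    else:                       # mutate the learner part
        fresh = random_genome()
        child["learner"] = fresh["learner"]
        child["learner_params"] = fresh["learner_params"]
    return child

print(mutate(random_genome()))
```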
“…Neural architecture search (NAS) is a popular solution to deep learning model selection [57, 62, 63, 102]: it automatically selects a good neural network architecture for a given learning task. Since an overly complex model may take an excessively long time to train, and may thus become a serious obstacle to neural architecture search [57, 62], the accuracy-complexity tradeoff is an important consideration in neural architecture search. Liu et al. [62] propose Progressive Neural Architecture Search, which searches for convolutional neural network architectures in increasing order of model complexity.…”
Section: Model Complexity in Model Selection and Design (mentioning)
confidence: 99%
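The accuracy-complexity tradeoff mentioned here can be made concrete with a rough sketch in the spirit of progressive search: candidates are visited in increasing order of parameter count and scored with a complexity penalty. The proxy evaluator and penalty weight are stand-ins; a real system would train and validate each candidate.

```python
# Rough sketch: enumerate fully connected architectures in increasing
# order of parameter count, then pick the best accuracy-minus-penalty
# score. proxy_accuracy() is a fake stand-in for real training.
import itertools

def param_count(widths, input_dim=100, n_classes=10):
    dims = [input_dim, *widths, n_classes]
    return sum((a + 1) * b for a, b in zip(dims, dims[1:]))  # weights + biases

def proxy_accuracy(widths):
    return 0.6 + 0.01 * min(sum(widths), 30)  # fake, monotone-ish proxy

candidates = [list(w) for r in (1, 2, 3)
              for w in itertools.product([8, 16, 32], repeat=r)]
candidates.sort(key=param_count)  # progressive: simplest models first

lam = 1e-6  # complexity penalty weight (an assumption)
best = max(candidates, key=lambda w: proxy_accuracy(w) - lam * param_count(w))
print(best, param_count(best))
```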