2020
DOI: 10.1007/978-3-030-55789-8_61
Constrained Evolutionary Piecemeal Training to Design Convolutional Neural Networks

Abstract: Neural Architecture Search (NAS), which automates the discovery of efficient neural networks, has demonstrated substantial potential in achieving state-of-the-art performance in a variety of domains, such as image classification and language understanding. In most NAS techniques, training of a neural network is treated as a separate task or as a performance estimation strategy used to carry out the architecture search. We demonstrate that a network architecture and its coefficients can be learned together by unifying conce…

Cited by 25 publications (14 citation statements)
References 23 publications
“…determined by its associated scenario. The derivation process builds upon an existing evolutionary NAS methodology [42], which searches for the best CNN in terms of high accuracy only. We extend this NAS algorithm to focus on multiple objectives, namely the ATME characteristics, to arrive at the Pareto front, which is a set of CNNs with Pareto optimality w.r.t.…”
Section: Scenarios Derivation
confidence: 99%
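The multi-objective extension described above culminates in a Pareto front: the set of candidate CNNs not dominated by any other. A minimal sketch of that filtering step, using hypothetical (accuracy, energy) scores in place of the full ATME characteristics (names and values are illustrative, not the authors' actual data):

```python
# Extract the Pareto front from scored candidates.
# Convention here: maximize accuracy, minimize energy.

def dominates(a, b):
    """a dominates b if a is no worse in every objective
    and strictly better in at least one."""
    acc_a, en_a = a
    acc_b, en_b = b
    return (acc_a >= acc_b and en_a <= en_b) and (acc_a > acc_b or en_a < en_b)

def pareto_front(candidates):
    """Keep only candidates that no other candidate dominates."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# Five hypothetical CNNs as (accuracy, energy) pairs; the last one
# is dominated (same accuracy as another, but higher energy).
cnns = [(0.92, 5.0), (0.90, 3.0), (0.91, 3.5), (0.88, 2.0), (0.90, 4.0)]
front = pareto_front(cnns)  # keeps the four non-dominated designs
```

The quadratic pairwise check is fine for the small populations typical of evolutionary NAS; faster non-dominated sorting exists for larger ones.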
“…Partial training refers to training for a short interval or on a subset of the total dataset. Partial training allows a CNN architecture to be searched during the training process itself [42]. Algorithm 1 outlines the complete approach.…”
Section: Throughput and Energy
confidence: 99%
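The interleaving of search and partial training described in this citation can be sketched as a simple evolutionary loop: every candidate trains for a short interval each generation, then selection and mutation act on the partially trained population. This is a hedged illustration of the idea only; the helper names (`train_partial`, `mutate`, `fitness`) are placeholders, not the authors' actual API.

```python
def evolutionary_piecemeal_training(population, generations,
                                    train_partial, mutate, fitness):
    """Search architectures while training proceeds, instead of
    fully training each candidate before evaluating it."""
    for _ in range(generations):
        # Train every candidate for a short interval only.
        for net in population:
            train_partial(net)
        # Rank by current (partially trained) fitness; keep the best half.
        population.sort(key=fitness, reverse=True)
        survivors = population[: len(population) // 2]
        # Refill the population with mutated copies of the survivors.
        population = survivors + [mutate(s) for s in survivors]
    return max(population, key=fitness)

# Toy demo: a "network" is just a dict with an accuracy score.
pop = [{"acc": 0.1 * i} for i in range(4)]
best = evolutionary_piecemeal_training(
    pop, generations=3,
    train_partial=lambda n: n.update(acc=n["acc"] + 0.05),
    mutate=lambda n: {"acc": n["acc"] + 0.01},
    fitness=lambda n: n["acc"])
```

The key contrast with conventional NAS is that fitness is always measured on partially trained candidates, so training time is amortized across the search rather than paid per candidate.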
“…All algorithms implemented in the framework have been chosen to ensure either function preservation or the minimal loss possible. Some of them were motivated by the genetic operators in evolutionary neural architecture search [20], where function preservation is crucial. The small performance loss from a major update is expected to be regained through continued training as new data keep arriving around the clock.…”
Section: B. Architecture Update
confidence: 99%
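A concrete instance of a function-preserving operator of the kind this citation alludes to is layer widening in the spirit of Net2WiderNet (a technique from the wider literature, not necessarily the one used in the cited framework): a unit is duplicated and its outgoing weights halved, so the widened network computes exactly the same function. A minimal sketch for a two-layer linear network, with illustrative weights:

```python
import numpy as np

def widen_linear(W_out, W_next, unit):
    """Duplicate `unit` in W_out's output and split its outgoing
    contribution in W_next, preserving the network's function."""
    W_out2 = np.vstack([W_out, W_out[unit:unit + 1]])        # copy the row
    W_next2 = np.hstack([W_next, W_next[:, unit:unit + 1]])  # copy the column
    W_next2[:, unit] /= 2.0                                  # split outgoing
    W_next2[:, -1] /= 2.0                                    # weight in half
    return W_out2, W_next2

# Illustrative weights: y = W2 @ (W1 @ x).
x = np.array([1.0, -2.0])
W1 = np.array([[0.5, 1.0], [2.0, -1.0]])
W2 = np.array([[1.0, 3.0]])
W1w, W2w = widen_linear(W1, W2, unit=0)
# Output before and after widening is identical:
assert np.allclose(W2 @ (W1 @ x), W2w @ (W1w @ x))
```

Because the widened network starts from the exact function of its parent, subsequent training only has to exploit the added capacity, which matches the citation's expectation that any update loss is regained by further training.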
“…In this direction, the chief contribution of this work is a novel NAS algorithm called Evolutionary Piecemeal Training. This paper is an extension of our earlier work presented in [41]; the methodology has been extended here with the ability to define multiple objectives for the search process.…”
Section: Introduction
confidence: 99%