Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/144

CP-NAS: Child-Parent Neural Architecture Search for 1-bit CNNs

Abstract: Neural architecture search (NAS) proves to be among the best approaches for many tasks by generating application-adaptive neural architectures, but it is still challenged by high computational cost and memory consumption. At the same time, 1-bit convolutional neural networks (CNNs) with binarized weights and activations show their potential for resource-limited embedded devices. One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS by taking advantage of the strengths of each in a unified framework.

Cited by 12 publications (7 citation statements). References 1 publication.
“…For this purpose, the dataset is divided into training and validation sets to be used for training a model and evaluating its performance, respectively. The results of the VSt can be directly fed into SSp and SSt for their modifications [25,26].…”
Section: Basic Knowledge
confidence: 99%
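The train/validation split described in the statement above is the backbone of differentiable NAS: network weights are fit on the training half, while the architecture parameters are scored and updated on the validation half, whose results then feed back into the search. A minimal PyTorch-style sketch of that alternating scheme; the toy supernet, layer sizes, and optimizer settings are illustrative assumptions, not CP-NAS code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset, random_split

# Toy supernet: one mixed layer whose two candidate ops are blended by a
# learnable architecture weight `alpha` (softmax-normalized).
class TinySuperNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.ops = nn.ModuleList([nn.Linear(16, 10), nn.Linear(16, 10)])
        self.alpha = nn.Parameter(torch.zeros(2))  # architecture parameter

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

# Divide the dataset into training and validation sets, as described above.
data = TensorDataset(torch.randn(256, 16), torch.randint(0, 10, (256,)))
train_set, val_set = random_split(data, [128, 128])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32, shuffle=True)

net = TinySuperNet()
w_opt = torch.optim.SGD([p for op in net.ops for p in op.parameters()], lr=0.05)
a_opt = torch.optim.Adam([net.alpha], lr=3e-4)

for (x_tr, y_tr), (x_va, y_va) in zip(train_loader, val_loader):
    # 1) update network weights on the training split
    w_opt.zero_grad()
    F.cross_entropy(net(x_tr), y_tr).backward()
    w_opt.step()
    # 2) update architecture parameters on the validation split
    a_opt.zero_grad()
    F.cross_entropy(net(x_va), y_va).backward()
    a_opt.step()
```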
“…As a result, the optimized model needed less memory and fewer floating-point operations (FLOPs). CP-NAS (Child-Parent NAS) [25] and DARTS (differentiable architecture search) [42] introduced continuous variables for each connection between layers. The optimizer continuously adjusts these variables with gradient-based methods to find the most suitable connections, emphasizing the important connections with higher values.…”
Section: Hierarchical-based Search Space (Figure 7b)
confidence: 99%
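The continuous relaxation this statement describes can be sketched as follows: each connection between layers holds a learnable vector with one entry per candidate operation, a softmax turns it into mixing weights, and after search the highest-valued operation is kept. A minimal PyTorch sketch with illustrative op choices and sizes, not the DARTS or CP-NAS codebase:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations on one edge (illustrative placeholders).
OPS = ["skip_connect", "conv_3x3", "max_pool_3x3"]

class MixedEdge(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.MaxPool2d(3, stride=1, padding=1),
        ])
        # One continuous variable per candidate op on this connection.
        self.theta = nn.Parameter(1e-3 * torch.randn(len(OPS)))

    def forward(self, x):
        weights = F.softmax(self.theta, dim=0)  # continuous relaxation
        return sum(w * op(x) for w, op in zip(weights, self.candidates))

edge = MixedEdge(channels=8)
out = edge(torch.randn(2, 8, 16, 16))  # differentiable w.r.t. theta

# After search, discretize by keeping the highest-valued operation.
best = OPS[edge.theta.argmax().item()]
print("selected op:", best)
```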
“…In binarized neural architecture search (BNAS) [35], neural architecture search is used to search BNNs, and the BNNs obtained by BNAS can outperform conventional models by a large margin. Another natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS by taking advantage of the strengths of each in a unified framework [36]. To accomplish this, a Child-Parent (CP) model is introduced to a differentiable NAS to search the binarized architecture (Child) under the supervision of a full precision model (Parent).…”
Section: F Neural Architecture Search
confidence: 99%
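The Child-Parent supervision mentioned here can be illustrated with a generic teacher-student loss: the full-precision Parent's soft outputs guide the binarized Child alongside the ordinary label loss. This is a hedged sketch; the weighting `lam`, temperature `T`, and KL form are assumptions, not the exact CP loss from the paper.

```python
import torch
import torch.nn.functional as F

def child_parent_loss(child_logits, parent_logits, labels, lam=0.5, T=4.0):
    # Ordinary label loss on the Child's predictions.
    hard = F.cross_entropy(child_logits, labels)
    # Distillation term: the Parent's softened outputs supervise the Child.
    soft = F.kl_div(
        F.log_softmax(child_logits / T, dim=1),
        F.softmax(parent_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1 - lam) * hard + lam * soft

logits_c = torch.randn(8, 10, requires_grad=True)  # Child (binarized) outputs
logits_p = torch.randn(8, 10)                      # Parent (full-precision) outputs
loss = child_parent_loss(logits_c, logits_p, torch.randint(0, 10, (8,)))
loss.backward()
```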
“…In our BNAS framework, we show that the BNNs obtained by BNAS can outperform conventional models by a large margin. While BNAS focuses only on kernel binarization, our CP-NAS [36] advances this work to binarize both the weights and activations to achieve 1-bit CNNs. In CP-NAS, a Child-Parent (CP) model is introduced to a differentiable NAS to search the binarized architecture (Child) under the supervision of a full precision model (Parent).…”
Section: Our Work on BNNs
confidence: 99%
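Binarizing both weights and activations, as this statement describes, typically pairs a sign function with a straight-through estimator (STE) so gradients can pass through the non-differentiable binarization. A minimal sketch under that standard formulation; the per-channel scaling factors used by specific BNN methods are omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.sign()  # 1-bit quantization

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # STE: pass the gradient through, clipped outside [-1, 1].
        return grad_out * (x.abs() <= 1).float()

class BinaryConv2d(nn.Conv2d):
    def forward(self, x):
        xb = BinarizeSTE.apply(x)            # binarized activations
        wb = BinarizeSTE.apply(self.weight)  # binarized weights
        return F.conv2d(xb, wb, self.bias, self.stride, self.padding)

layer = BinaryConv2d(8, 8, 3, padding=1)
y = layer(torch.randn(2, 8, 16, 16))
y.sum().backward()  # gradients flow to full-precision weights via the STE
```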
“…Following [49,24,46,47,7], we search for computation cells as the building blocks of the final architecture. Different from these approaches, we search for v (v > 2) kinds of cells instead of only normal and reduction cells.…”
Section: Search Space
confidence: 99%
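Searching for v > 2 kinds of cells, as in this last statement, amounts to keeping a separate set of architecture parameters per cell kind rather than sharing only a "normal" and a "reduction" set. A minimal sketch with illustrative sizes (edges, ops, and the cell layout are assumptions, not from the cited paper):

```python
import torch
import torch.nn as nn

V_KINDS, N_EDGES, N_OPS = 4, 14, 8  # illustrative sizes

# One architecture-parameter matrix (edges x candidate ops) per cell kind,
# instead of just two shared matrices for normal and reduction cells.
arch_params = nn.ParameterList(
    [nn.Parameter(1e-3 * torch.randn(N_EDGES, N_OPS)) for _ in range(V_KINDS)]
)

# A 12-cell network cycling through the v kinds; each cell reads the
# parameter set of its own kind during the search.
cell_kinds = [i % V_KINDS for i in range(12)]
for i, k in enumerate(cell_kinds):
    theta = arch_params[k]  # this cell's continuous variables
    print(f"cell {i}: kind {k}, params {tuple(theta.shape)}")
```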