2019
DOI: 10.48550/arxiv.1912.02037
Preprint

AdversarialNAS: Adversarial Neural Architecture Search for GANs

Abstract: Neural Architecture Search (NAS) that aims to automate the procedure of architecture design has achieved promising results in many computer vision fields. In this paper, we propose an AdversarialNAS method specially tailored for Generative Adversarial Networks (GANs) to search for a superior generative model on the task of unconditional image generation. The proposed method leverages an adversarial searching mechanism to search for the architectures of generator and discriminator simultaneously in a differenti…
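The abstract sketches the core idea: relax the discrete architecture choices of both generator and discriminator into differentiable parameters and optimize them against each other alongside the network weights. Below is a minimal, illustrative sketch of that general recipe, not the authors' implementation; it uses a DARTS-style softmax mixture over candidate operations and collapses the usual bilevel weight/architecture optimization into a single optimizer per network for brevity. All class names, candidate operations, and toy shapes are assumptions.

```python
# Illustrative sketch only: a DARTS-style relaxation applied to both the
# generator and discriminator of a GAN, with architecture parameters (alphas)
# updated adversarially together with the weights. Names and shapes are
# hypothetical, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

CANDIDATES = [
    lambda c: nn.Conv2d(c, c, 3, padding=1),
    lambda c: nn.Conv2d(c, c, 5, padding=2),
    lambda c: nn.Identity(),
]

class MixedOp(nn.Module):
    """Continuous relaxation: softmax-weighted sum over candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(make(channels) for make in CANDIDATES)
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))  # architecture params

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class TinyG(nn.Module):
    """Toy searchable generator: z -> 3x8x8 image."""
    def __init__(self, z_dim=64, c=32):
        super().__init__()
        self.c = c
        self.fc = nn.Linear(z_dim, c * 8 * 8)
        self.cell = MixedOp(c)
        self.out = nn.Conv2d(c, 3, 3, padding=1)

    def forward(self, z):
        h = self.fc(z).view(-1, self.c, 8, 8)
        return torch.tanh(self.out(self.cell(h)))

class TinyD(nn.Module):
    """Toy searchable discriminator: image -> real/fake logit."""
    def __init__(self, c=32):
        super().__init__()
        self.stem = nn.Conv2d(3, c, 3, padding=1)
        self.cell = MixedOp(c)
        self.head = nn.Linear(c, 1)

    def forward(self, x):
        h = self.cell(self.stem(x)).mean(dim=(2, 3))  # global average pooling
        return self.head(h)

def adversarial_search_step(G, D, g_opt, d_opt, real, z_dim=64):
    """One alternating step. Both optimizers cover weights AND alphas, so the
    two architectures are searched against each other through the GAN losses."""
    z = torch.randn(real.size(0), z_dim)
    # Discriminator step (non-saturating GAN loss).
    d_opt.zero_grad()
    d_loss = F.softplus(-D(real)).mean() + F.softplus(D(G(z).detach())).mean()
    d_loss.backward()
    d_opt.step()
    # Generator step: fool the current discriminator.
    g_opt.zero_grad()
    g_loss = F.softplus(-D(G(z))).mean()
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    G, D = TinyG(), TinyD()
    g_opt = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.0, 0.9))
    d_opt = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.0, 0.9))
    dummy_real = torch.randn(8, 3, 8, 8)  # stand-in for a batch of real images
    print(adversarial_search_step(G, D, g_opt, d_opt, dummy_real))
```

The sketch only illustrates how gradients from the adversarial losses reach the architecture parameters of both networks; per the abstract, the point of searching the discriminator jointly is to provide an evolving adversary for the generator search.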

Cited by 6 publications (7 citation statements)
References 40 publications

Citation statements

“…We perform experiments on some popular benchmarks, including the CIFAR10 and CIFAR100 datasets that have different numbers of classes. For unconditional image generation, our search method shows superior performance, comparable to those equipped with the automatic search [13,11,38]. For class-conditional image generation, our approach shows significant gains in terms of important criteria (e.g., FID and intra FIDs).…”
Section: Introduction (mentioning)
confidence: 94%
“…There are a number of improvements for the original GAN, e.g., changing the objective function [1,15,27,18,33], improving network architecture [34,3,21,9,42,19], using multiple generators or discriminators [37,17,2,10,12,30]. Recently, the surge in neural architecture search (NAS) has triggered a wave of interest in automatically designing the network architecture of GAN [38,13,11].…”
Section: Related Work (mentioning)
confidence: 99%
“…Neural Architecture Search. Neural Architecture Search (NAS) has been utilized to search the optimal network architecture for numerous computer vision problems, such as image classification [1,17,28,30,41,49], object detection [11,24,35,42,45], image segmentation [29,53], and image synthesis [10,14]. In general, the NAS algorithms need to solve the problems of architecture search and weight optimization.…”
Section: Related Work (mentioning)
confidence: 99%
“…The computational cost for AGAN is comparably very expensive (1200 GPU days). In addition, AdversarialNAS [10] and DEGAS [9] adopted a different approach, i.e., differentiable searching strategy [35], for the GAN architecture search problem.…”
Section: Related Work (mentioning)
confidence: 99%
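For reference, the "differentiable searching strategy [35]" mentioned above relaxes each discrete operation choice into a softmax over candidates during search and then discretizes the result, typically by keeping the highest-weighted candidate. A minimal sketch of that discretization step, reusing the hypothetical MixedOp from the earlier snippet:

```python
import torch

def discretize(mixed_op):
    """Keep only the candidate operation with the largest architecture
    weight (argmax over alpha). This is the usual DARTS-style step that
    turns a relaxed, searched cell back into a discrete architecture.
    `mixed_op` is the hypothetical MixedOp defined in the sketch above."""
    best = torch.argmax(mixed_op.alpha).item()
    return mixed_op.ops[best]
```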