2021
DOI: 10.1007/s00521-021-06309-8
DEGAS: differentiable efficient generator search

Cited by 15 publications (3 citation statements)
References 21 publications
“…Finally, in Table 7, we compare our method with the state-of-the-art unsupervised GANs. We should point out that these methods use different generators, as a result of network architecture search (Gong et al., 2019; Doveh & Giryes, 2019). Despite the non-optimality of our generator architecture, our approach yields a very competitive FID.…”
Section: Methods
Confidence: 98%
“…Some approaches use a reconstruction loss to optimize the generator without a discriminator [5, 26]. Other approaches used architecture search to find the generator architecture [19, 14, 54, 55].…”
Section: Related Work
Confidence: 99%
“…The Progressive Growing GAN [21] was created based on [1, 16], with the main idea of progressively adding new layers of higher resolution during training, which helps to create highly realistic images. [14, 12, 40] developed neural architecture search methods to find an optimal neural network architecture for training a GAN on a particular task.…”
Section: Related Work
Confidence: 99%