We prove a two-way correspondence between the min-max optimization of general CPE loss function GANs and the minimization of associated f-divergences. We then focus on α-GAN, defined via the α-loss, which interpolates several GANs (Hellinger, vanilla, Total Variation) and corresponds to the minimization of the Arimoto divergence. We show that the Arimoto divergences induced by α-GAN are equivalent in convergence for all α ∈ (0, ∞]. However, under restricted learning models and finite samples, we provide estimation bounds which indicate diverse GAN behavior as a function of α. Finally, we present empirical results on a toy dataset that highlight the practical utility of tuning the α hyperparameter.
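The α-loss that defines α-GAN admits a short closed form. The sketch below is an illustrative implementation based on the standard definition of α-loss (function names are ours, not the authors' code); α = 1 recovers the log-loss of the vanilla GAN and α = ∞ gives the soft 0-1 loss:

```python
import math

def alpha_loss(p: float, alpha: float) -> float:
    """α-loss of assigning probability p to the true class.

    Standard form: (α / (α - 1)) * (1 - p^((α - 1)/α)) for α ∉ {1, ∞};
    α = 1 is the log-loss limit, α = ∞ is the soft 0-1 loss 1 - p.
    Illustrative sketch, not the paper's reference implementation.
    """
    if alpha == 1:
        return -math.log(p)          # log-loss (cross-entropy) limit
    if math.isinf(alpha):
        return 1.0 - p               # soft 0-1 loss limit
    return (alpha / (alpha - 1.0)) * (1.0 - p ** (1.0 - 1.0 / alpha))
```

Varying α between these extremes is what lets α-GAN interpolate between the Hellinger, vanilla, and Total Variation GANs mentioned above.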
In an effort to address the training instabilities of GANs, we introduce a class of dual-objective GANs with different value functions (objectives) for the generator (G) and discriminator (D). In particular, we model each objective using α-loss, a tunable classification loss, to obtain (αD, αG)-GANs, parameterized by (αD, αG) ∈ [0, ∞)². For a sufficiently large number of samples and sufficient capacities for G and D, we show that the resulting non-zero-sum game simplifies to minimizing an f-divergence under appropriate conditions on (αD, αG). In the finite sample and capacity setting, we define estimation error to quantify the gap in the generator's performance relative to the optimal setting with infinite samples and obtain upper bounds on this error, showing it to be order optimal under certain conditions. Finally, we highlight the value of tuning (αD, αG) in alleviating training instabilities for the synthetic 2D Gaussian mixture ring and the Stacked MNIST datasets.
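The dual-objective structure above can be sketched concretely. In a CPE-loss GAN the value function is V(G, D) = E_real[−ℓ(1, D(x))] + E_fake[−ℓ(0, D(x))]; in the (αD, αG) game, D maximizes V with α = αD while G minimizes V with α = αG. The following is a minimal sketch under those assumptions (variable names and the sample D outputs are ours, purely illustrative):

```python
import numpy as np

def alpha_loss(p, alpha):
    # α-loss of assigning probability p to the correct label (log-loss at α = 1)
    if alpha == 1.0:
        return -np.log(p)
    return (alpha / (alpha - 1.0)) * (1.0 - p ** (1.0 - 1.0 / alpha))

def value_fn(d_real, d_fake, alpha):
    """CPE-loss GAN value function with α-loss (sketch, not the authors' code):
    V_α = E_real[-ℓ_α(D(x))] + E_fake[-ℓ_α(1 - D(x))]."""
    return np.mean(-alpha_loss(d_real, alpha)) + np.mean(-alpha_loss(1.0 - d_fake, alpha))

# Non-zero-sum game: D ascends V_{αD}, G descends V_{αG}, evaluated
# on the same discriminator outputs (illustrative numbers below).
d_real = np.array([0.8, 0.9])  # D's outputs on real samples
d_fake = np.array([0.2, 0.3])  # D's outputs on generated samples
v_d = value_fn(d_real, d_fake, alpha=2.0)   # discriminator's objective (αD = 2)
v_g = value_fn(d_real, d_fake, alpha=0.5)   # generator's objective (αG = 0.5)
```

Setting αD = αG = 1 collapses both objectives to the vanilla GAN value function, recovering the usual zero-sum game as a special case.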