2020
DOI: 10.48550/arxiv.2006.06320
Preprint

Hypernetwork-Based Augmentation

Chih-Yang Chen,
Che-Han Chang

Abstract: Data augmentation is an effective technique to improve the generalization of deep neural networks. Recently, AutoAugment [1] proposed a well-designed search space and a search algorithm that automatically finds augmentation policies in a data-driven manner. However, AutoAugment is computationally intensive. In this paper, we propose an efficient gradient-based search algorithm, called Hypernetwork-Based Augmentation (HBA), which simultaneously learns model parameters and augmentation hyperparameters in a singl…
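The abstract is truncated here, but the core idea it describes — a hypernetwork that maps augmentation hyperparameters to model weights so that both can be updated by gradient descent in a single training run — can be illustrated on a toy problem. Everything below (the linear hypernetwork `w(p) = a*p + b`, the input-scaling "augmentation", the single data point, the learning rate) is an illustrative assumption for this sketch, not the paper's actual construction.

```python
# Toy sketch of hypernetwork-based joint optimization (illustrative only).
# Hypernetwork: w(p) = a*p + b maps an augmentation hyperparameter p to a
# model weight w; the model is y_hat = w * x. The "augmentation" here
# simply scales the input by (1 + p).

a, b, p = 0.0, 1.0, 0.5      # hypernetwork params, augmentation strength
lr = 0.1
x, y = 2.0, 4.0              # one data point, used for train and validation

# Inner step: descend the *augmented* training loss w.r.t. (a, b).
w = a * p + b                # w = 1.0
x_aug = (1.0 + p) * x        # augmented input = 3.0
err = w * x_aug - y          # -1.0
ga = 2.0 * err * x_aug * p   # chain rule via dw/da = p  -> -3.0
gb = 2.0 * err * x_aug       # chain rule via dw/db = 1  -> -6.0
a -= lr * ga                 # 0.3
b -= lr * gb                 # 1.6

# Outer step: descend the *clean* validation loss w.r.t. p, with the
# gradient flowing through the hypernetwork path (dw/dp = a).
w = a * p + b                # 0.3*0.5 + 1.6 = 1.75
err = w * x - y              # -0.5
gp = 2.0 * err * x * a       # -0.6
p -= lr * gp                 # 0.56

print(round(a, 2), round(b, 2), round(p, 2))  # 0.3 1.6 0.56
```

The point of the sketch is the plumbing: because the weights are a differentiable function of the augmentation hyperparameter, the validation loss gives a gradient for `p` without retraining the model from scratch for each candidate policy, which is what makes this cheaper than search-based approaches like AutoAugment.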

Cited by 1 publication (1 citation statement)
References 18 publications
“…AutoML has focused on achieving better results via automatic model selection [17,20], including neural architecture search (NAS) [74,19,42]. Other important AutoML topics include hyper-parameter selection [37,2] and data augmentation [14,39,12,5], which are closer to our setting of optimizing the dataset weights. Since the main signal for a model's performance is the final validation loss, which requires full optimization of the model for each evaluation, AutoML approaches often incur steep computational costs.…”
Section: Related Work
confidence: 99%