2020
DOI: 10.48550/arxiv.2007.07936
Preprint

ClassMix: Segmentation-Based Data Augmentation for Semi-Supervised Learning

Abstract: The state of the art in semantic segmentation is steadily increasing in performance, resulting in more precise and reliable segmentations in many different applications. However, progress is limited by the cost of generating labels for training, which sometimes requires hours of manual labor for a single image. Because of this, semi-supervised methods have been applied to this task, with varying degrees of success. A key challenge is that common augmentations used in semi-supervised classification are less eff…
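As a rough illustration of the augmentation named in the title, the sketch below mixes two unlabeled images using a binary mask built from half of the classes in one image's predicted segmentation, and mixes the corresponding pseudo-labels the same way. The function name, tensor shapes, and class-selection rule are illustrative assumptions, not the authors' reference implementation.

```python
import torch

def classmix(img_a, img_b, pred_a, pred_b):
    """Mix two images with a class-based binary mask (hypothetical helper).

    img_a, img_b : (C, H, W) image tensors
    pred_a, pred_b : (H, W) argmax predictions from the current model
    Returns the mixed image and its mixed pseudo-label.
    Requires a recent PyTorch (torch.isin).
    """
    classes = torch.unique(pred_a)
    # Randomly select roughly half of the classes present in image A.
    chosen = classes[torch.randperm(len(classes))[: max(1, len(classes) // 2)]]
    mask = torch.isin(pred_a, chosen).float()            # (H, W), 1 where a chosen class appears
    mixed_img = mask.unsqueeze(0) * img_a + (1 - mask).unsqueeze(0) * img_b
    mixed_lbl = torch.where(mask.bool(), pred_a, pred_b)  # pseudo-label follows the same mask
    return mixed_img, mixed_lbl
```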

Cited by 10 publications (41 citation statements). References 17 publications.
“…Cityscapes defines 19 urban-scene semantic classes for semantic segmentation. Following previous standards [26,42,19,44,18,41] in semi-supervised semantic segmentation, we randomly sample 1/8 or 1/4 of the training images to construct the labeled set; the remaining training images constitute the unlabeled set. To further explore the effectiveness of the proposed method, we also conduct experiments on the PASCAL VOC 2012 dataset (VOC12) [15], which provides 20 semantic classes and 1 background class.…”
Section: Methods
Mentioning confidence: 99%
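To make the split protocol quoted above concrete, here is a minimal sketch of randomly sampling a labeled subset (1/8 or 1/4 of the training images) with the remainder serving as the unlabeled set; the function name and seed handling are assumptions for illustration.

```python
import random

def split_labeled_unlabeled(image_ids, labeled_fraction=1/8, seed=0):
    """Randomly sample a labeled subset; the rest forms the unlabeled set.

    image_ids : list of training image identifiers (e.g. Cityscapes file names)
    labeled_fraction : 1/8 or 1/4 in the protocol described above
    """
    rng = random.Random(seed)
    ids = list(image_ids)
    rng.shuffle(ids)
    n_labeled = max(1, int(len(ids) * labeled_fraction))
    return ids[:n_labeled], ids[n_labeled:]   # (labeled, unlabeled)
```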
“…Recently, many approaches regularize the learner to produce predictions that are invariant to perturbations [34,45]. Along this line of thought, the data augmentation strategy CutMix [65] can serve as a perturbation by mixing predictions and assembling new, informative pseudo-labels [19,44]. This approach is often accompanied by a mean teacher [58] serving as a robust model for distilling invariance into the student.…”
Section: Related Work
Mentioning confidence: 99%
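Below is a minimal sketch of the CutMix-style mixing described in the statement above, where a rectangular region from one unlabeled image is pasted into another and the pseudo-labels are mixed the same way. The function names, the box-sampling heuristic, and the fixed area ratio are illustrative assumptions.

```python
import torch

def cutmix_box(h, w, ratio=0.5, generator=None):
    """Sample a rectangular CutMix mask covering roughly `ratio` of the image area."""
    cut_h, cut_w = int(h * ratio ** 0.5), int(w * ratio ** 0.5)
    cy = torch.randint(0, h - cut_h + 1, (1,), generator=generator).item()
    cx = torch.randint(0, w - cut_w + 1, (1,), generator=generator).item()
    mask = torch.zeros(h, w)
    mask[cy:cy + cut_h, cx:cx + cut_w] = 1.0
    return mask

def cutmix_pseudo(img_a, img_b, plabel_a, plabel_b):
    """Paste a box from image B into image A and mix the pseudo-labels accordingly."""
    mask = cutmix_box(img_a.shape[-2], img_a.shape[-1])      # (H, W) binary mask
    mixed_img = (1 - mask) * img_a + mask * img_b            # broadcasts over channels
    mixed_lbl = torch.where(mask.bool(), plabel_b, plabel_a)
    return mixed_img, mixed_lbl
```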
“…Mean teachers (i.e., an exponential moving average over previous model parameters) were successfully introduced for semi-supervised classification [58] and have seen some use in segmentation [19,44,48]. The idea is that better predictions can be obtained by maintaining a teacher model whose parameters are continuously updated as a moving average of the student's parameters and the previous teacher parameters.…”
Section: Multi-label Deeply Supervised Nets
Mentioning confidence: 99%
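A minimal sketch of the mean-teacher update described above, assuming a PyTorch-style student and teacher with identical architectures; the decay value and buffer handling are illustrative choices, not taken from any of the cited papers.

```python
import torch

@torch.no_grad()
def update_teacher(teacher, student, alpha=0.99):
    """Exponential moving average update of the teacher parameters.

    teacher_param <- alpha * teacher_param + (1 - alpha) * student_param
    """
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(alpha).add_(s_p, alpha=1 - alpha)
    for t_b, s_b in zip(teacher.buffers(), student.buffers()):
        t_b.copy_(s_b)   # keep BatchNorm statistics in sync (a common choice)
```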
“…While consistency regularization has been successfully employed for classification tasks, applying traditional data augmentation techniques to semantic segmentation has been shown [5] to be less effective, as semantic segmentation may not exhibit low-density regions around class boundaries. Several approaches address this issue by applying augmentation in the encoded space instead of the input space [6], or by enforcing consistent predictions for unsupervised mixed samples, as in CutMix [7,5], CowMix [8], and ClassMix [9].…”
Section: Introduction
Mentioning confidence: 99%
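To tie the mixing and mean-teacher pieces together, here is a hedged sketch of the consistency term described above: the student is asked to reproduce, on a mixed image, the correspondingly mixed hard pseudo-labels from the teacher. `mix_fn` stands for a batched variant of a ClassMix- or CutMix-style mixing function and is an assumption of this sketch, not a fixed API from the cited papers.

```python
import torch
import torch.nn.functional as F

def mixed_consistency_loss(student, teacher, img_a, img_b, mix_fn):
    """Unsupervised consistency term on mixed samples.

    img_a, img_b : (B, C, H, W) batches of unlabeled images
    mix_fn       : batched mixing function returning (mixed_img, mixed_lbl)
    """
    with torch.no_grad():
        pl_a = teacher(img_a).argmax(dim=1)     # (B, H, W) hard pseudo-labels
        pl_b = teacher(img_b).argmax(dim=1)
    mixed_img, mixed_lbl = mix_fn(img_a, img_b, pl_a, pl_b)
    logits = student(mixed_img)                 # (B, num_classes, H, W)
    return F.cross_entropy(logits, mixed_lbl)
```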