2021
DOI: 10.1109/tpami.2019.2960224
Semi-Supervised Semantic Segmentation With High- and Low-Level Consistency

Abstract: The ability to understand visual information from limited labeled data is an important aspect of machine learning. While image-level classification has been extensively studied in a semi-supervised setting, dense pixel-level classification with limited data has only drawn attention recently. In this work, we propose an approach for semi-supervised semantic segmentation that learns from limited pixel-wise annotated samples while exploiting additional annotation-free images. It uses two network branches that lin…

Cited by 327 publications (221 citation statements)
References 28 publications
“…After FCN [18] and U-net [19] established the basic semantic segmentation network architecture under supervised learning, researchers have also strived to leverage weak supervision instead, such as multiple instance learning [20], the EM algorithm [21], and constrained CNNs [22], or semi-supervision by additionally using a few pixel-wise segmentation labels [23], [24]. Similar to this paper, some weakly supervised methods [25], [26] used attention masks and classification tags and achieved an excellent level of semantic segmentation.…”
Section: Weakly Supervised Semantic Segmentation
confidence: 99%
“…Only four recent works consider true semi-supervised learning for semantic segmentation, which uses a large number of unlabeled samples and a small number of fully labeled samples to train the network. Three works [14], [22], [28] use Generative Adversarial Networks (GANs) as the backbone network but differ in their training strategies. Souly et al. [28] utilize a GAN to generate additional data for improving the feature extraction capabilities of the segmentation network.…”
Section: Semi-Supervised Semantic Segmentation
confidence: 99%
“…Souly et al. [28] utilize a GAN to generate additional data for improving the feature extraction capabilities of the segmentation network. Hung et al. [14] and Mittal et al. [22] propose GAN-based segmentation networks that learn beneficial feature information from unlabeled data to assist training on the labeled data. On the other hand, Bellver et al. [3] use the aforementioned self-learning framework for semantic segmentation, where the prediction results on unlabeled images are accepted as ground-truth labels.…”
Section: Semi-Supervised Semantic Segmentation
confidence: 99%
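The self-learning (pseudo-labeling) scheme described in that statement — accepting confident predictions on unlabeled images as ground-truth labels — can be sketched as follows. The function name, the 0.9 confidence threshold, and the per-pixel masking are illustrative assumptions, not the cited authors' implementation.

```python
import numpy as np

def pseudo_label(probs: np.ndarray, threshold: float = 0.9):
    """Turn a softmax map for one unlabeled image into training targets.

    probs: (H, W, C) per-pixel class probabilities.
    Returns (labels, keep): hard per-pixel labels and a boolean mask
    selecting only pixels confident enough to be trained on.
    """
    labels = probs.argmax(axis=-1)      # most likely class per pixel
    confidence = probs.max(axis=-1)     # probability of that class
    keep = confidence >= threshold      # discard uncertain pixels
    return labels, keep
```

In practice the `keep` mask would typically be applied via an ignore label in the cross-entropy loss, so unconfident pixels contribute no gradient.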
“…Semi-supervised segmentation methods, which attempt to reduce annotation effort by utilizing a large set of unlabeled data for model learning, are becoming increasingly prevalent in the computer vision domain. Most of those methods [35]–[37] employ an adversarial loss for semi-supervised learning on unlabeled data. Inspired by them, in this article our proposed SemiTongue conducts segmentation in the semi-supervised setting by utilizing a large amount of unlabeled data, while imposing a self-reconstruction constraint between labeled and unlabeled data to improve the accuracy of tongue segmentation.…”
Section: Related Work
confidence: 99%
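The adversarial semi-supervised methods these statements refer to (e.g., Hung et al. [14]) typically combine a supervised cross-entropy term on labeled images with weighted adversarial and semi-supervised terms on unlabeled images. A minimal sketch of that loss composition, with hypothetical weight values:

```python
def semi_supervised_loss(l_ce: float, l_adv: float, l_semi: float,
                         lambda_adv: float = 0.01,
                         lambda_semi: float = 0.1) -> float:
    """Combine per-batch loss terms into one training objective.

    l_ce:   cross-entropy on labeled images
    l_adv:  adversarial loss (fool the discriminator)
    l_semi: loss on confident discriminator-selected unlabeled regions
    The lambda weights are illustrative, not values from any cited paper.
    """
    return l_ce + lambda_adv * l_adv + lambda_semi * l_semi
```

The small adversarial weight keeps the discriminator signal from overwhelming the supervised term early in training.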