2021
DOI: 10.48550/arxiv.2108.11900
Preprint

Self-supervised Multi-scale Consistency for Weakly Supervised Segmentation Learning

Abstract: Collecting large-scale medical datasets with fine-grained annotations is time-consuming and requires experts. For this reason, weakly supervised learning aims at optimising machine learning models using weaker forms of annotations, such as scribbles, which are easier and faster to collect. Unfortunately, training with weak labels is challenging and needs regularisation. Herein, we introduce a novel self-supervised multiscale consistency loss, which, coupled with an attention mechanism, encourages the segmentor…
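The abstract names a multi-scale consistency loss but the full method is not shown here. A minimal illustrative sketch of the general idea, not the paper's actual loss: penalise disagreement between the average-pooled full-resolution prediction and predictions made at coarser scales. The function names and the dict-of-scales interface below are assumptions for illustration only.

```python
import numpy as np

def avg_pool2d(x, k):
    """Average-pool a 2D map by an integer factor k (H and W must be divisible by k)."""
    h, w = x.shape
    return x.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def multiscale_consistency_loss(pred_maps):
    """Toy multi-scale consistency loss.

    pred_maps: dict mapping scale factor -> probability map predicted at that
    scale, where scale 1 is the full-resolution prediction. Returns the mean
    squared disagreement between the pooled full-resolution map and each
    coarse-scale map.
    """
    full = pred_maps[1]
    losses = []
    for scale, coarse in pred_maps.items():
        if scale == 1:
            continue
        pooled = avg_pool2d(full, scale)  # bring full-res map to the coarse grid
        losses.append(np.mean((pooled - coarse) ** 2))
    return float(np.mean(losses))
```

If the coarse predictions exactly match the pooled full-resolution map, the loss is zero; any cross-scale disagreement makes it positive, which is the regularising signal such a consistency term provides during weakly supervised training.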
