2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)
DOI: 10.1109/iccvw.2019.00058

Photometric Transformer Networks and Label Adjustment for Breast Density Prediction

Cited by 13 publications (7 citation statements) · References 11 publications

“…Learning from Noisy Labels: Neural network anti-noise training has made great progress. Current research is mainly focused on estimating the noise transition matrix [7, 35–40], designing robust loss functions [41–44], correcting noisy labels [45–50], importance weighting of samples [51–55], and meta-learning [56–59]. Our work falls in the category of estimating the noise transition matrix.…”
Section: Related Work
confidence: 99%
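As a rough illustration of the transition-matrix strategy mentioned in the excerpt above (a minimal sketch, not code from the cited paper or the works it references), the snippet below applies forward loss correction with an assumed-known matrix T, where T[i, j] = P(observed label = j | true label = i). The class count and noise rate are hypothetical.

```python
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_labels, T):
    """Cross-entropy against the observed (noisy) labels, after pushing the
    model's clean-class posterior through the transition matrix T."""
    p_clean = F.softmax(logits, dim=1)   # model's estimate of the clean posterior
    p_noisy = p_clean @ T                # P(observed label | x) = P(clean class | x) · T
    return F.nll_loss(torch.log(p_noisy + 1e-8), noisy_labels)

# Hypothetical 3-class example with 20% symmetric label noise.
T = torch.full((3, 3), 0.1)
T.fill_diagonal_(0.8)
logits = torch.randn(4, 3)
noisy_labels = torch.tensor([0, 2, 1, 0])
loss = forward_corrected_loss(logits, noisy_labels, T)
```

In practice T is not given and must be estimated, which is exactly the problem the transition-matrix line of work addresses.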
“…Several methods have been proposed for the noisy-label problem, exploring different strategies such as robust loss functions (Wang et al., 2019a), label cleansing (Jaehwan et al., 2019; Yuan et al., 2018), sample weighting (Ren et al., 2018), meta-learning (Han et al., 2018a), ensemble learning (Miao et al., 2015), and others (Yu et al., 2018; Kim et al., 2019; Zhang et al., 2019). Below, we focus on the prior work that is closest to our approach and that shows competitive results on the main benchmarks.…”
Section: Prior Work
confidence: 99%
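For the robust-loss family named in this excerpt, one common formulation is a generalized cross-entropy (L_q) loss, which bounds the penalty a single mislabelled sample can contribute. The sketch below is illustrative only and is not claimed to be the loss used in any of the works cited above.

```python
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, labels, q=0.7):
    """L_q loss, (1 - p_y^q) / q: behaves like cross-entropy as q -> 0 and
    like MAE at q = 1, reducing sensitivity to mislabelled samples."""
    p = F.softmax(logits, dim=1)
    p_y = p.gather(1, labels.unsqueeze(1)).squeeze(1).clamp_min(1e-8)
    return ((1.0 - p_y ** q) / q).mean()

# Hypothetical usage with random logits and labels (8 samples, 5 classes).
logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
loss = generalized_cross_entropy(logits, labels)
```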
“…Recently proposed noisy-label learning methods rely on the following strategies: robust loss functions [36, 49, 51], sample selection [16], label cleansing [21, 56], sample weighting [46], meta-learning [15], ensemble learning [38], and semi-supervised learning (SSL) [4]. The most successful approaches use SSL combined with other methods [22, 30, 33].…”
Section: Prior Work
confidence: 99%
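The SSL-based pipelines this excerpt refers to typically start from a small-loss split: low-loss samples are kept as probably clean, and the rest are re-used as unlabelled data for a semi-supervised objective. The sketch below shows only that split; the keep ratio and the downstream SSL objective are assumptions, not details taken from the cited works.

```python
import torch
import torch.nn.functional as F

def split_by_small_loss(logits, noisy_labels, keep_ratio=0.5):
    """Small-loss selection: return (probably-clean indices, indices to be
    treated as unlabelled by a semi-supervised objective)."""
    losses = F.cross_entropy(logits, noisy_labels, reduction="none")
    order = torch.argsort(losses)            # ascending per-sample loss
    n_keep = max(1, int(keep_ratio * losses.numel()))
    return order[:n_keep], order[n_keep:]

# Hypothetical usage: 10 samples, 4 classes.
logits = torch.randn(10, 4)
noisy_labels = torch.randint(0, 4, (10,))
clean_idx, unlabelled_idx = split_by_small_loss(logits, noisy_labels)
```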