2022
DOI: 10.3390/app12146842

Combating Label Noise in Image Data Using MultiNET Flexible Confident Learning

Abstract: Deep neural networks (DNNs) have been used successfully for many image classification problems. One of the most important factors that determines the final efficiency of a DNN is the correct construction of the training set. Erroneously labeled training images can degrade the final accuracy and additionally lead to unpredictable model behavior, reducing reliability. In this paper, we propose MultiNET, a novel method for the automatic detection of noisy labels within image datasets. MultiNET is an adaptation of…

Cited by 2 publications (1 citation statement)
References 16 publications
“…To train the DNN, we used the PyTorch Python library with Stochastic Gradient Descent (SGD) optimization [49] and a cross-entropy loss function. These are frequently used methods in similar image classification tasks [50]. As a training set, we selected images from 2 (random) of the 10–15 trials.…”
Section: Preliminary Results
confidence: 99%
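The citing authors' training recipe (SGD minimizing a cross-entropy loss) can be sketched without PyTorch itself. Below is a minimal, self-contained illustration of the same idea in plain Python: single-sample stochastic gradient descent on a binary logistic-regression model with a cross-entropy objective. The toy dataset, learning rate, and epoch count are illustrative choices, not values from the paper; in PyTorch the equivalent pieces would be `torch.optim.SGD` and `torch.nn.CrossEntropyLoss`.

```python
# Sketch of the described recipe: stochastic gradient descent (one sample
# per update) minimizing cross-entropy, here on a tiny 2-feature
# logistic-regression model so the example stays self-contained.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cross_entropy(p, y):
    eps = 1e-12  # guard against log(0)
    return -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))

def train_sgd(data, lr=0.5, epochs=200, seed=0):
    """data: list of ((x1, x2), label) pairs; returns weights (w, b)."""
    rng = random.Random(seed)
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        rng.shuffle(data)                 # fresh sample order each epoch
        for (x1, x2), y in data:          # one sample per step: "stochastic"
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            g = p - y                     # d(cross-entropy)/d(logit)
            w[0] -= lr * g * x1
            w[1] -= lr * g * x2
            b -= lr * g
    return w, b

if __name__ == "__main__":
    # Hypothetical toy set: label 1 iff x1 + x2 > 1.
    data = [((0.0, 0.0), 0), ((1.0, 0.0), 0), ((0.0, 1.0), 0),
            ((1.0, 1.0), 1), ((2.0, 1.0), 1), ((1.0, 2.0), 1)]
    w, b = train_sgd(list(data))
    loss = sum(cross_entropy(sigmoid(w[0] * x1 + w[1] * x2 + b), y)
               for (x1, x2), y in data) / len(data)
    print(f"mean cross-entropy after training: {loss:.3f}")
```

The `g = p - y` line is the whole gradient derivation for this model: the derivative of cross-entropy with respect to the logit collapses to prediction minus label, which is why SGD with this loss pair is so cheap per step.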