2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/icmla.2018.00078
MedAL: Accurate and Robust Deep Active Learning for Medical Image Analysis

Abstract: Active Learning methods create an optimized and labeled training set from unlabeled data. We introduce a novel Online Active Deep Learning method for Medical Image Analysis, extending our MedAL active learning framework with new results. Experiments on three medical image datasets show that our novel online active learning model requires significantly fewer labelings, is more accurate, and is more robust to class imbalances than existing methods. Our method is also more accurate and computat…

Cited by 59 publications (38 citation statements, published 2019–2023)
References 31 publications
“…Finally, we train the model on the new data and repeat the sampling process over again. The original MedAL paper (Smailagic et al., 2018) evaluated a variety of distance functions and found empirically that Euclidean distance and cosine distance yielded the highest and second highest entropy, respectively. [Figure 1: Proposed active learning pipeline.] To solve a supervised classification task, we will use a deep network (DN), an initial labeled dataset D_train, an unlabeled dataset D_oracle, and an oracle who can label data.…”
Section: Sampling Based on Distance Between Feature Embeddings
Confidence: 99%
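The distance-based sampling step this excerpt describes — pick the unlabeled point whose feature embedding is, on average, farthest from the labeled set — can be sketched in a few lines of NumPy. This is a minimal illustration under assumptions, not the authors' implementation; the function name and the choice of mean-distance aggregation are hypothetical.

```python
import numpy as np

def select_most_distant(unlabeled, labeled, metric="euclidean"):
    """Return the index of the unlabeled embedding whose average
    distance to all labeled embeddings is largest.

    unlabeled: (U, D) array of embeddings for the unlabeled pool.
    labeled:   (L, D) array of embeddings for the labeled set.
    """
    if metric == "euclidean":
        # Pairwise distances via broadcasting -> shape (U, L).
        d = np.linalg.norm(unlabeled[:, None, :] - labeled[None, :, :], axis=-1)
    elif metric == "cosine":
        # Cosine distance = 1 - cosine similarity of unit-normalized rows.
        u = unlabeled / np.linalg.norm(unlabeled, axis=1, keepdims=True)
        v = labeled / np.linalg.norm(labeled, axis=1, keepdims=True)
        d = 1.0 - u @ v.T
    else:
        raise ValueError(f"unknown metric: {metric}")
    # Average distance to the labeled set; largest average wins.
    return int(d.mean(axis=1).argmax())
```

The selected sample would then be sent to the oracle for labeling, added to D_train, and the model retrained, closing the loop the excerpt describes.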
“…To configure the AL sampler, we use the Euclidean distance and obtain feature embeddings from the Mixed5 layer of Inception V3. These choices are a result of our prior empirical evidence that this combination of layer and distance function achieves the highest entropy (Smailagic et al., 2018).…”
Section: MedAL Implementation
Confidence: 99%
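The "highest entropy" selection criterion mentioned here can be illustrated with a small sketch: estimate the Shannon entropy of the empirical distribution of pairwise distances, and prefer the layer/distance combination whose distances are most spread out. This is a hedged reconstruction, not the paper's exact procedure; the histogram estimator and bin count are assumptions.

```python
import numpy as np

def distance_entropy(dist_matrix, bins=10):
    """Shannon entropy (bits) of a histogram of pairwise distances.

    A higher value means the distance function spreads samples out
    more, i.e. it is more discriminative for sample selection.
    """
    counts, _ = np.histogram(dist_matrix.ravel(), bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins before taking the log
    return float(-(p * np.log2(p)).sum())
```

Under this criterion a degenerate distance function, where every pair is equally far apart, scores zero entropy and would never be chosen.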
“…By constructing neural networks with deep and special architectures, we can approximate a wide range of highly non-linear and complex mapping functions [59]. A variety of deep neural networks have proved effective and powerful in many real-world tasks, including computer vision [60], natural language processing [61], medical imaging [62,63] and video games [64]. Typical architectures of deep neural networks include the deep convolutional neural network (CNN) [60] and the recurrent neural network [65].…”
Section: Architectures of Feature Extractor, Domain Discriminator, an…
Confidence: 99%
“…This technique can be conceptualised by analogy to a diligent student who, while taking a course, actively asks the teacher for more examples on topics that are hard to understand. There are several attempts at active learning for histopathological image classification in the literature, using both traditional machine learning [37,39–42] and deep learning [43–46]. However, none of these approaches utilised uncertainty for the selection of new samples during active training.…”
Section: Introduction
Confidence: 99%
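The uncertainty-based selection that this excerpt notes was missing from prior histopathology work is commonly implemented as maximum predictive entropy: query the sample whose predicted class distribution is closest to uniform. A minimal sketch, assuming each row of `probs` is a softmax output (the function name is hypothetical):

```python
import numpy as np

def most_uncertain(probs):
    """Index of the sample with maximum predictive (Shannon) entropy.

    probs: (N, C) array where each row is a class-probability vector.
    """
    p = np.clip(probs, 1e-12, 1.0)       # guard against log(0)
    h = -(p * np.log(p)).sum(axis=1)     # per-sample entropy in nats
    return int(h.argmax())
```

A confident prediction like (0.99, 0.01) has near-zero entropy, while (0.5, 0.5) is maximal, so the sampler asks the oracle about the examples the model finds hardest, mirroring the "diligent student" analogy above.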