2022
DOI: 10.1609/aaai.v36i7.20688

TrustAL: Trustworthy Active Learning Using Knowledge Distillation

Abstract: Active learning can be defined as iterations of data labeling, model training, and data acquisition, until sufficient labels are acquired. A traditional view of data acquisition is that, through iterations, knowledge from human labels and models is implicitly distilled to monotonically increase the accuracy and label consistency. Under this assumption, the most recently trained model is a good surrogate for the current labeled data, from which data acquisition is requested based on uncertainty/diversity. Our c…
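The abstract frames active learning as a labeling, training, and acquisition loop in which the most recently trained model stands in for the labeled data and drives uncertainty-based acquisition. For reference, here is a minimal sketch of that traditional pool-based loop; the logistic-regression model, entropy acquisition function, seed-set size, and per-round budget are illustrative assumptions, not details taken from the paper.

```python
# Minimal pool-based active learning loop: label -> train -> acquire.
# All concrete choices (model, budget, entropy scoring) are assumptions
# for illustration, not the paper's configuration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

labeled = list(rng.choice(len(X), size=20, replace=False))  # seed label set
pool = [i for i in range(len(X)) if i not in set(labeled)]  # unlabeled pool

for rnd in range(5):  # iterate until the labeling budget is exhausted
    model = LogisticRegression(max_iter=1000)
    model.fit(X[labeled], y[labeled])        # train on current labels

    probs = model.predict_proba(X[pool])     # query the latest model
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)

    # Acquire the most uncertain pool examples and "label" them.
    query = np.argsort(entropy)[-10:]
    for q in sorted(query, reverse=True):    # pop high indices first
        labeled.append(pool.pop(q))

    print(f"round {rnd}: {len(labeled)} labels, "
          f"accuracy {model.score(X, y):.3f}")
```

This sketch implements exactly the "traditional view" the abstract sets up before its truncated contribution statement: every round trusts the latest model as the surrogate from which uncertainty is measured.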

Cited by 6 publications (1 citation statement). References 22 publications.
“…To bridge the gap between AL and deep learning models, Kwak et al [186] introduce trustworthy AL (TrustAL), a label-efficient DAL framework that transfers distilled knowledge from deep learning models to the data selection process. As shown in Fig.…”
Section: Challenges and Opportunities of DAL
confidence: 99%
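The citing survey describes TrustAL as transferring distilled knowledge from deep learning models into the data selection process. For orientation, below is a minimal sketch of the standard soft-label knowledge-distillation loss (Hinton-style) that such a transfer typically builds on; the temperature T, the mixing weight alpha, and the idea of using a model from an earlier acquisition round as teacher are illustrative assumptions, not TrustAL's exact formulation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      T=2.0, alpha=0.5):
    """Mix a soft-target distillation term with hard-label cross-entropy.

    Hypothetical helper for illustration; TrustAL's actual loss and its
    coupling to data selection are specified in the paper, not here.
    """
    # Soft targets: KL divergence between temperature-softened
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: cross-entropy against the human labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In a framework like the one described, a model from a previous acquisition round could serve as the teacher, with the distilled student then informing which unlabeled examples to query next.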