2019 IEEE/CVF International Conference on Computer Vision (ICCV) 2019
DOI: 10.1109/iccv.2019.00148

Transferability and Hardness of Supervised Classification Tasks

Abstract: We propose a novel approach for estimating the difficulty and transferability of supervised classification tasks. Unlike previous work, our approach is solution agnostic and does not require or assume trained models. Instead, we estimate these values using an information theoretic approach: treating training labels as random variables and exploring their statistics. When transferring from a source to a target task, we consider the conditional entropy between two such variables (i.e., label assignments of the t…
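The abstract's core idea — treating label assignments as random variables and scoring transferability via conditional entropy — can be illustrated with a minimal sketch. This is a hypothetical empirical estimator, not the authors' code; the function and variable names are our own:

```python
from collections import Counter
from math import log

def conditional_entropy(source_labels, target_labels):
    """Empirical H(target | source): treat the two label sequences as
    paired samples of random variables and estimate from counts."""
    n = len(source_labels)
    joint = Counter(zip(source_labels, target_labels))  # counts of (z, y)
    marginal = Counter(source_labels)                   # counts of z
    h = 0.0
    for (z, y), c in joint.items():
        h -= (c / n) * log(c / marginal[z])  # -P(z, y) * log P(y | z)
    return h

# "dog" maps to both "pet" and "wild", so the target labels are not fully
# determined by the source labels and the conditional entropy is positive.
z = ["cat", "cat", "dog", "dog"]
y = ["pet", "pet", "pet", "wild"]
score = -conditional_entropy(z, y)  # NCE-style: closer to 0 = easier transfer
```

When the target labeling is a deterministic function of the source labeling, the conditional entropy is zero, matching the intuition that such a transfer is easy.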


Cited by 95 publications (55 citation statements)
References 53 publications
“…On a different note, there have been few proposals of viable metrics for evaluating similarity between real datasets, with the goal of obtaining predictors of the associated transfer learning performance. Some of these architecture agnostic metrics evaluate dataset distances based on information theory, information geometry and optimal transport [53][54][55][56][57]. An interesting direction for future work would be to connect these distance metrics with the parametric transformations in the CHMM.…”
Section: Connection With Related Work
confidence: 99%
“…The log expectation of the empirical predictor (LEEP) is used as a transferability measure. The LEEP method is closely related to Negative Conditional Entropy (NCE) proposed by Tran et al (2019), an information-theoretic quantity (Cover, 1999) to study the transferability and hardness between classification tasks.…”
Section: Assessing Transferability Of Pre-trained Models
confidence: 99%
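The LEEP construction described in the statements above can be sketched roughly as follows. This is a simplified, hypothetical rendering of the empirical-predictor idea from Nguyen et al. (2020), with array shapes and names of our own choosing, not the authors' implementation:

```python
import numpy as np

def leep(source_probs, target_labels, num_target_classes):
    """Sketch of a LEEP-style score: average log-likelihood, on the target
    data, of the 'empirical predictor' built from a source model's
    soft predictions.

    source_probs: (n, Z) rows of probabilities over the source label set.
    target_labels: (n,) integer target labels in [0, num_target_classes).
    """
    n, num_source_classes = source_probs.shape
    # Empirical joint P(y, z): accumulate each example's soft source
    # prediction into the row of its target label.
    joint = np.zeros((num_target_classes, num_source_classes))
    for theta, label in zip(source_probs, target_labels):
        joint[label] += theta / n
    p_y_given_z = joint / joint.sum(axis=0)        # column-normalize
    # Empirical predictor: P(y | x) = sum_z P(y | z) * theta(x)_z
    predictions = source_probs @ p_y_given_z.T     # shape (n, Y)
    return float(np.mean(np.log(predictions[np.arange(n), target_labels])))
```

Higher (less negative) scores indicate that the source model's predictions carry more information about the target labels, suggesting easier transfer.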
“…Despite its practical significance, there is limited guidance on task adaptive pre-trained model selection. Based on NCE (Tran et al, 2019), Nguyen et al (2020) recently studied the problem when both the pre-train task and the downstream task are classification. They construct an empirical predictor by estimating the joint distribution over the pretrained and target label spaces and take the performance of the empirical predictor (LEEP) to assess pre-trained models.…”
Section: Introduction
confidence: 99%
“…Recently, NCE [21] and LEEP [8] measure task transferability by estimating an approximation of conditional entropy, H(y_j | y_i) = H(y_j) − I(y_i; y_j). Ignoring H(y_j) and focusing on the mutual information term, we can understand these two methods as measuring the dependency between two target spaces Y_i and Y_j.…”
Section: A Taxonomy Of Task Similarities
confidence: 99%
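The decomposition H(y_j | y_i) = H(y_j) − I(y_i; y_j) quoted above follows from the chain rule of entropy, and can be checked numerically with a small sketch (hypothetical helper names, natural-log entropies):

```python
from collections import Counter
from math import log

def entropy(samples):
    """Empirical Shannon entropy (nats) of a list of hashable samples."""
    n = len(samples)
    return -sum((c / n) * log(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

y_i = [0, 0, 1, 1, 2, 2]
y_j = [0, 0, 0, 1, 1, 1]
# Chain rule: H(y_j | y_i) = H(y_i, y_j) - H(y_i)
h_cond = entropy(list(zip(y_i, y_j))) - entropy(y_i)
# Quoted decomposition: H(y_j | y_i) = H(y_j) - I(y_i; y_j)
h_decomp = entropy(y_j) - mutual_information(y_i, y_j)
```

Both expressions reduce to H(y_i, y_j) − H(y_i), so they agree exactly on any sample.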