2022
DOI: 10.1109/access.2022.3171741

Fisher Task Distance and its Application in Neural Architecture Search

Abstract: We formulate an asymmetric (or non-commutative) distance between tasks based on Fisher Information Matrices, called Fisher task distance. This distance represents the complexity of transferring the knowledge of one task to another. We provide a proof of consistency for our distance through theorems and experiments on various classification tasks from MNIST, CIFAR-10, CIFAR-100, ImageNet, and Taskonomy datasets. Next, we construct an online neural architecture search framework using the Fisher task distance, in…
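The distance described in the abstract can be illustrated with a minimal toy sketch. Assumptions beyond the abstract: the Fisher Information Matrix is approximated by its diagonal (mean squared per-sample gradients of a small logistic probe model), and the distance takes a Fréchet-style form d = (1/√2)‖F_a^{1/2} − F_b^{1/2}‖_F between those diagonals. The function names and synthetic data here are hypothetical; in the paper the distance is asymmetric because the Fishers are estimated on a network fitted to the target task, whereas this sketch reuses one fixed probe for both tasks, which makes it symmetric.

```python
import numpy as np

def diag_fisher(w, X, y):
    """Diagonal empirical Fisher of a logistic probe w on data (X, y):
    the mean over samples of the squared per-sample NLL gradient."""
    p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
    g = (p - y)[:, None] * X              # per-sample gradients of the NLL
    return np.mean(g ** 2, axis=0)        # one Fisher entry per parameter

def fisher_task_distance(F_a, F_b):
    """Frechet-style distance between diagonal Fishers (assumed form):
    d = (1/sqrt(2)) * || F_a^(1/2) - F_b^(1/2) ||_F."""
    return np.linalg.norm(np.sqrt(F_a) - np.sqrt(F_b)) / np.sqrt(2.0)

# Toy setup: one shared probe, two synthetic binary tasks on the same inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))
w = rng.normal(size=4)                            # fixed probe weights
y_a = (X @ np.array([1.0, 0.0, 0.0, 0.0]) > 0).astype(float)  # task a
y_b = (X @ np.array([0.0, 0.0, 0.0, 1.0]) > 0).astype(float)  # task b

F_a = diag_fisher(w, X, y_a)
F_b = diag_fisher(w, X, y_b)
d = fisher_task_distance(F_a, F_b)
print(d)
```

The distance is zero between a task and itself and non-negative otherwise, consistent with the consistency properties the abstract claims to prove.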

Cited by 2 publications (2 citation statements)
References: 41 publications
“…As described above, the CF-LSTM is designed to predict the rating based on a fixed number of hypotheses (e.g., 2) in the dataset. However, CF-LSTM can handle the increase or decrease in the number of hypotheses using transfer learning [23,25,22,24,26]. In other words, we can transfer the prior knowledge from the pre-trained CF-LSTM to a new causal NLP task using transfer learning techniques [23,22].…”
Section: Transfer Learning With CF-LSTM
confidence: 99%
“…However, CF-LSTM can handle the increase or decrease in the number of hypotheses using transfer learning [23,25,22,24,26]. In other words, we can transfer the prior knowledge from the pre-trained CF-LSTM to a new causal NLP task using transfer learning techniques [23,22]. For example, the new data samples arrive with added treatment types.…”
Section: Transfer Learning With CF-LSTM
confidence: 99%