2020
DOI: 10.1145/3360309
Transfer Learning with Dynamic Distribution Adaptation

Abstract: Transfer learning aims to learn robust classifiers for the target domain by leveraging knowledge from a source domain. Since the source and the target domains are usually from different distributions, existing methods mainly focus on adapting the cross-domain marginal or conditional distributions. However, in real applications, the marginal and conditional distributions usually have different contributions to the domain discrepancy. Existing methods fail to quantitatively evaluate the different importance of t…

Cited by 172 publications (121 citation statements)
References 65 publications
“…Furthermore, the original training dataset provided by the customer may itself be from multiple private sources (e.g., mobile crowdsensing) and may follow some multimodal distribution. The AI marketplace should also be able to help the customer by allowing the aggregation of multiple alternative datasets from other data owners in the marketplace while also ensuring the aggregate dataset follows a similar distribution (e.g., using a transfer learning approach [18]) as the validation dataset. This is important since AI learning algorithms suffer from major model quality loss (or even divergence) when trained on non-IID data [19].…”
Section: Technical Aspects of AI Marketplace
confidence: 99%
“…Joint Distribution Adaptation (JDA) [27] improves TCA by considering not only the marginal distribution shift but also the conditional distribution shift, using pseudo-labels for the target domain. Wang et al [53,54] improve JDA by adaptively weighting the marginal and conditional distributions. Li et al [55][56][57] adopted MMD to eliminate the discrepancy of features and distributions between the source and target domains under heterogeneous domain adaptation.…”
Section: A Domain Adaptation
confidence: 99%
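The adaptive weighting the quoted passage attributes to Wang et al can be illustrated with a minimal sketch (function names and the linear-kernel choice are my own assumptions, not the paper's API): the marginal term is the MMD between whole-domain means, the conditional term sums per-class MMDs computed with target pseudo-labels (the JDA trick), and a balance factor mu in [0, 1] weighs the two contributions.

```python
import numpy as np

def linear_mmd2(Xs, Xt):
    # Squared MMD under a linear kernel: ||mean(Xs) - mean(Xt)||^2.
    diff = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(diff @ diff)

def dynamic_distribution_distance(Xs, ys, Xt, yt_pseudo, mu=0.5):
    # Marginal term: distance between the whole-domain distributions.
    marginal = linear_mmd2(Xs, Xt)
    # Conditional term: per-class distances, using pseudo-labels
    # for the unlabeled target domain.
    conditional = 0.0
    for c in np.unique(ys):
        s, t = Xs[ys == c], Xt[yt_pseudo == c]
        if len(s) and len(t):
            conditional += linear_mmd2(s, t)
    # mu balances the two contributions (mu=0: marginal only,
    # mu=1: conditional only).
    return (1.0 - mu) * marginal + mu * conditional
```

Setting mu adaptively per iteration, rather than fixing it, is the "dynamic" part of the adaptation.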
“…General feature-based transfer learning methods include Transfer Component Analysis (TCA) [27], Joint Distribution Adaptation (JDA) [28], Balanced Distribution Adaptation (BDA) [29], etc. These methods are unsupervised transfer learning approaches, where no label information is available in the target domain.…”
Section: Transfer Learning in the Wireless Fingerprinting Localization
confidence: 99%
“…The result can be obtained with a dimensionality reduction method, that is, by solving for the leading m eigenvectors of (KLK + µI)⁻¹KHK, which avoids the computational cost of solving an SDP. TCA is the base method in feature-based transfer learning; many other methods, such as JDA [28] and BDA [29], extend it.…”
Section: Transfer Component Analysis
confidence: 99%
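The eigen-solution the quoted passage describes can be sketched as follows (a minimal illustration with hypothetical helper names, not TCA's reference implementation): K is a kernel matrix over the n = ns + nt pooled samples, L the MMD coefficient matrix, and H = I − (1/n)11ᵀ the centering matrix; the leading m eigenvectors of (KLK + µI)⁻¹KHK come from the equivalent generalized eigenproblem (KHK)w = λ(KLK + µI)w.

```python
import numpy as np
from scipy.linalg import eigh

def tca_embed(K, ns, nt, mu=1.0, m=2):
    # MMD coefficient matrix L for source (first ns rows) vs target.
    n = ns + nt
    e = np.r_[np.full(ns, 1.0 / ns), np.full(nt, -1.0 / nt)]
    L = np.outer(e, e)
    # Centering matrix H = I - (1/n) 1 1^T.
    H = np.eye(n) - np.ones((n, n)) / n
    # Leading m eigenvectors of (KLK + mu I)^{-1} KHK, via the
    # generalized symmetric problem (KHK) w = lambda (KLK + mu I) w.
    A = K @ H @ K
    B = K @ L @ K + mu * np.eye(n)   # mu > 0 keeps B positive definite
    vals, vecs = eigh(A, B)          # eigenvalues in ascending order
    W = vecs[:, ::-1][:, :m]         # keep the m largest
    return K @ W                     # embedded samples, shape (n, m)
```

With a linear kernel K = XXᵀ over the pooled source and target features, the returned n×m matrix gives the transfer components in which the domain discrepancy is minimized.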