2021
DOI: 10.48550/arxiv.2105.14328
Preprint

Transfer Learning under High-dimensional Generalized Linear Models

Abstract: In this work, we study the transfer learning problem under high-dimensional generalized linear models (GLMs), which aim to improve the fit on target data by borrowing information from useful source data. Given which sources to transfer, we propose an oracle algorithm and derive its ℓ2-estimation error bounds. The theoretical analysis shows that under certain conditions, when the target and source are sufficiently close to each other, the estimation error bound could be improved over that of the classical penal…
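The two-step idea in the abstract (borrow strength from the sources, then correct on the target) can be illustrated with a small sketch. The code below is a minimal illustration, not the authors' actual algorithm or code: it assumes the common "fit on pooled data, then fit a sparse correction on the target" recipe for an ℓ1-penalized logistic regression (one instance of a GLM), and all names (fit_l1_logistic, transfer_glm_sketch, lam_pool, lam_delta, the step size and iteration count) are hypothetical choices made for the example.

```python
# A minimal sketch (NOT the paper's exact algorithm or code) of two-step
# transfer learning for a sparse GLM, instantiated as L1-penalized logistic
# regression with labels in {-1, +1}.  All tuning values are illustrative.

import numpy as np


def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def fit_l1_logistic(X, y, lam, offset=None, lr=0.01, n_iter=5000):
    """L1-penalized logistic regression via proximal gradient (ISTA).

    Minimizes (1/n) * sum_i log(1 + exp(-y_i * (x_i' b + offset_i))) + lam * ||b||_1.
    The optional offset lets Step 2 fit a sparse correction on top of the
    Step 1 coefficients.
    """
    n, p = X.shape
    if offset is None:
        offset = np.zeros(n)
    beta = np.zeros(p)
    for _ in range(n_iter):
        margins = y * (X @ beta + offset)
        # Gradient of the average logistic loss at the current beta.
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        beta = soft_threshold(beta - lr * grad, lr * lam)
    return beta


def transfer_glm_sketch(X_src, y_src, X_tgt, y_tgt, lam_pool=0.05, lam_delta=0.05):
    """Two-step transfer estimator: pooled fit, then target-only correction."""
    # Step 1: rough estimate w_hat from pooled source + target samples.
    X_pool = np.vstack([X_src, X_tgt])
    y_pool = np.concatenate([y_src, y_tgt])
    w_hat = fit_l1_logistic(X_pool, y_pool, lam_pool)

    # Step 2: on the target data only, fit a sparse contrast delta so that
    # beta_hat = w_hat + delta_hat; the L1 penalty keeps the correction small
    # when the source and target coefficients are close.
    delta_hat = fit_l1_logistic(X_tgt, y_tgt, lam_delta, offset=X_tgt @ w_hat)
    return w_hat + delta_hat
```

A call such as beta_hat = transfer_glm_sketch(X_src, y_src, X_tgt, y_tgt) returns the corrected target coefficients; when source and target coefficients are close, the sparse correction stays near zero and the pooled fit dominates, which is the informal mechanism behind the improved error bound described in the abstract.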

Cited by 5 publications (14 citation statements)
References 22 publications
“…In contrast to existing transfer learning methods based on GLM, the above procedure has benefits in estimation accuracy and flexibility to be implemented in the federated setting. Compared to a recent work (Tian and Feng, 2021), the above procedure has a faster convergence rate, which is in fact minimax optimal under mild conditions. Moreover, our method learns w^(k) independently in Step 1 and Step 2, while in other related methods (Tian and Feng, 2021; Li et al., 2020a), a pooled analysis is conducted with data from multiple populations.…”
Section: The Proposed Algorithm
confidence: 81%
“…Compared to a recent work (Tian and Feng, 2021), the above procedure has a faster convergence rate, which is in fact minimax optimal under mild conditions. Moreover, our method learns w^(k) independently in Step 1 and Step 2, while in other related methods (Tian and Feng, 2021; Li et al., 2020a), a pooled analysis is conducted with data from multiple populations. In a federated setting, finding a proper initialization is challenging for such a pooled estimator due to various levels of heterogeneity.…”
Section: The Proposed Algorithm
confidence: 81%
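The statements above contrast learning w^(k) separately for each source with a pooled analysis across populations. The sketch below is only a schematic illustration of that distinction, an assumption of this rewrite rather than the cited paper's procedure: it reuses the hypothetical fit_l1_logistic helper from the earlier sketch and shows why per-source fits are easier to run in a federated setting, since only coefficient vectors, not raw data, need to leave each site.

```python
# Schematic contrast (illustrative only): pooled fit vs. per-source fits
# that can run locally in a federated setting.  Reuses fit_l1_logistic
# from the sketch above; `sources` is a list of (X_k, y_k) pairs.

import numpy as np


def pooled_fit(sources, lam=0.05):
    # Requires gathering all raw data in one place before fitting.
    X = np.vstack([Xk for Xk, _ in sources])
    y = np.concatenate([yk for _, yk in sources])
    return fit_l1_logistic(X, y, lam)


def federated_fit(sources, lam=0.05):
    # Each source k fits its own w^(k) locally; only the p-dimensional
    # coefficient vectors are shared and combined, here by a simple
    # sample-size-weighted average.
    coefs, weights = [], []
    for Xk, yk in sources:
        coefs.append(fit_l1_logistic(Xk, yk, lam))
        weights.append(len(yk))
    return np.average(np.vstack(coefs), axis=0, weights=weights)
```

The size-weighted average is only a placeholder aggregation rule for the illustration; the point is that the federated variant never moves raw observations between populations.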
“…Another active area of integrative data analysis is transfer learning, which extracts knowledge from source task(s) to help better solve a different but relevant target task. Results of transfer learning are established for linear models (Yang et al., 2020; Li et al., 2020a; Tripuraneni et al., 2021), sparse (generalized) linear models (Bastani, 2021; Lei et al., 2021; Tian and Feng, 2021) and nonparametric models (Kpotufe and Martinet, 2021; Cai and Wei, 2021; Reeve et al., 2021). The above list is far from exhaustive.…”
Section: Related Work
confidence: 99%