2019 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata47090.2019.9006060
Online Federated Multitask Learning

Cited by 27 publications (14 citation statements) · References 5 publications
“…Second, the framework reschedules the training of clients to solve the local imbalance. To deal with a scenario where new devices keep joining the system, Li et al. [22] introduce an online FL approach that derives model parameters for new devices from their local data and the existing model, without revisiting the data of existing devices. The approach achieves accuracy comparable to conventional algorithms at lower computation, communication, and storage costs.…”
Section: A. Learning Efficiency of FL
confidence: 99%
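The snippet above describes deriving a joining device's parameters from the existing global model and that device's local data alone. A minimal sketch of this warm-start idea, assuming an illustrative linear least-squares model and gradient-descent fine-tuning (the model, loss, and hyperparameters are assumptions for the example, not the cited paper's exact method):

```python
import numpy as np

def local_adapt(global_w, X, y, lr=0.1, steps=100):
    """Derive a new device's parameters from the global model and its own data.

    Only the new device's (X, y) is touched -- no other device's data is
    revisited. Linear least-squares is used purely for illustration.
    """
    w = global_w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5 * mean squared error
        w -= lr * grad
    return w

# A new device joins with its own local data and starts from the global model.
rng = np.random.default_rng(0)
global_w = np.zeros(3)                                # existing (shared) model
X = rng.normal(size=(50, 3))                          # new device's local features
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=50)
w_new = local_adapt(global_w, X, y)                   # device-specific parameters
```

Because adaptation uses only the joining device's data, the server-side cost of onboarding is independent of how many devices are already in the system, which matches the lower computation, communication, and storage costs noted above.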
“…However, under extreme data heterogeneity across the devices, training a single model may result in poor performance for a portion of devices. This has motivated a new line of ML research that trains user-specific models, referred to as multi-task learning [34], [35] or personalized federated learning [36]; in the latter, meta-gradient updates are introduced to improve training efficiency. Compared to this literature, we propose and investigate a new distributed ML paradigm over wireless networks called hierarchical nested personalized federated learning (HN-PFL), inspired by meta-gradient updates.…”
Section: Related Work
confidence: 99%
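The personalization idea referenced above, meta-gradient updates that learn a shared initialization each device can quickly fine-tune into a user-specific model, can be sketched with a Reptile-style meta-update. This is an illustrative sketch, not the cited papers' exact algorithms; the linear tasks, step sizes, and round counts are all assumptions:

```python
import numpy as np

def sgd_steps(w, X, y, lr=0.05, steps=10):
    """Local fine-tuning: a few gradient steps on one device's data."""
    for _ in range(steps):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

def meta_round(w_meta, tasks, meta_lr=0.5):
    """One meta-round (Reptile-style): each device adapts locally, then the
    server moves the shared initialization toward the adapted models' mean."""
    adapted = [sgd_steps(w_meta.copy(), X, y) for X, y in tasks]
    return w_meta + meta_lr * (np.mean(adapted, axis=0) - w_meta)

# Heterogeneous devices: related but distinct linear regression tasks.
rng = np.random.default_rng(1)
base = np.array([1.0, -1.0])
tasks = []
for _ in range(4):
    w_true = base + 0.3 * rng.normal(size=2)          # per-device ground truth
    X = rng.normal(size=(40, 2))
    tasks.append((X, X @ w_true))

w_meta = np.zeros(2)
for _ in range(30):
    w_meta = meta_round(w_meta, tasks)                # learn shared initialization
```

After meta-training, each device personalizes by running `sgd_steps` from `w_meta` on its own data, which is the sense in which a single shared model is replaced by user-specific ones.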
“…Constraint (34), imposed on the leader UAVs, guarantees that enough energy remains after parameter broadcasting and flying to reach the nearest recharging station. Constraints (35) and (36) ensure that the total amount of data offloaded from each device i ∈ C(u, s) and each coordinator UAV h ∈ W u is at most the size of the available data set. As a result of this offloading, (37) and (38) capture the total number of datapoints at the UAVs.…”
Section: Joint Energy and Performance Optimization
confidence: 99%
“…In [1,12], different multitask extensions of online learning are investigated; see also [3,17,6,13] for related extensions to meta-learning. Some online multitask applications are studied in [32,27,28], but without providing any regret analyses. In [33,40], the authors extend the results of [10] to dynamically updated interaction matrices.…”
Section: Related Work
confidence: 99%
“…with L ∈ ℝ^(N×d) and µ ∈ ℝ^N the Lagrange multipliers associated with constraints (29) and (28), respectively. For all i ≤ N, j ≤ d, differentiating with respect to Y_ij and setting the gradient to 0 yields:…”
Section: A. Technical Proofs
confidence: 99%