2020
DOI: 10.1109/twc.2019.2961673

Federated Learning via Over-the-Air Computation

Abstract: The stringent requirements for low latency and privacy of emerging high-stakes applications with intelligent devices, such as drones and smart vehicles, make cloud computing inapplicable in these scenarios. Instead, edge machine learning is becoming increasingly attractive for performing training and inference directly at network edges without sending data to a centralized data center. This stimulates a nascent field termed federated learning, for training a machine learning model on computation, storage, e…


Cited by 915 publications (655 citation statements)
References 49 publications
“…• At a typical UE k, it solves a local optimization problem (14) using only the data stored on the device. Based on the solution, the UE updates the local reference a_t[k] per (18) and, if selected by the AP, sends out a global update Δv_k^t via the allocated subchannel. • At the AP side, it selects a subgroup of UEs for update collection, decodes the received packets, and performs a global aggregation according to (20).…”
Section: Federated Learning in Wireless Network
confidence: 99%
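The per-round procedure in the statement above (local optimization at each UE, device selection at the AP, global aggregation of the received updates) can be sketched as a toy federated-averaging loop. The linear-regression setup, the selection size, and names such as `local_step` are illustrative assumptions, not the cited paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: K devices, each holding local data for a shared linear model.
K, d = 8, 5
X = [rng.normal(size=(20, d)) for _ in range(K)]
w_true = rng.normal(size=d)
y = [Xk @ w_true + 0.01 * rng.normal(size=20) for Xk in X]

def local_step(w, Xk, yk, lr=0.1, epochs=5):
    """Local optimization on on-device data only; returns the model
    update the UE would transmit over its allocated subchannel."""
    w_local = w.copy()
    for _ in range(epochs):
        grad = Xk.T @ (Xk @ w_local - yk) / len(yk)
        w_local -= lr * grad
    return w_local - w

w = np.zeros(d)
for t in range(30):
    # The AP selects a subgroup of UEs each round ...
    selected = rng.choice(K, size=4, replace=False)
    updates = [local_step(w, X[k], y[k]) for k in selected]
    # ... and performs a global aggregation of the received updates.
    w += np.mean(updates, axis=0)

print(np.linalg.norm(w - w_true))  # small residual: the global model converged
```

Even with only half the devices participating per round, averaging the updates drives the global model toward the common optimum.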
“…However, such a relaxation may not be tight, i.e., the solution obtained by SDR may not satisfy the rank-one constraint. As pointed out in [45], [46], the performance of SDR degrades sharply as the problem size grows. In our case, when M and/or K is large, the probability of returning a rank-one solution is low.…”
Section: A Mixed ℓ1,2-Norm for Group Sparsity Inducing
confidence: 91%
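The tightness issue quoted above can be illustrated numerically: check the eigenvalue gap of an SDR solution matrix, and fall back to the standard Gaussian-randomization step to extract a candidate vector when the relaxation is not rank one. The matrices below are synthetic and the objective in the randomization step is a toy placeholder, not the paper's beamforming objective:

```python
import numpy as np

rng = np.random.default_rng(1)

def rank_one_gap(M):
    """Ratio of the second-largest to the largest eigenvalue of a PSD
    matrix; near zero means the SDR solution is numerically rank one."""
    ev = np.sort(np.linalg.eigvalsh(M))[::-1]
    return ev[1] / ev[0]

def gaussian_randomization(M, n_trials=100):
    """When the relaxation is not tight, sample x ~ N(0, M) and keep the
    best candidate under a toy criterion (smallest norm after scaling the
    first entry to one); a real objective would be problem-specific."""
    L = np.linalg.cholesky(M + 1e-9 * np.eye(M.shape[0]))
    best, best_val = None, np.inf
    for _ in range(n_trials):
        x = L @ rng.normal(size=M.shape[0])
        x = x / x[0]
        val = np.linalg.norm(x)
        if val < best_val:
            best, best_val = x, val
    return best

v = np.array([1.0, 2.0, 0.5, -1.0])
u = np.array([0.0, -0.5, 2.0, 1.0])
M_tight = np.outer(v, v)                         # rank one: SDR is tight
M_loose = np.outer(v, v) + 0.5 * np.outer(u, u)  # rank two: not tight

print(rank_one_gap(M_tight))   # essentially zero
print(rank_one_gap(M_loose))   # clearly bounded away from zero
x = gaussian_randomization(M_loose)
```

As the quoted statement notes, the larger the problem, the more often one lands in the `M_loose` case and has to rely on such extraction heuristics.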
“…The proposed BAA can dramatically reduce communication latency compared with traditional Orthogonal Frequency Division Multiple Access (OFDMA). [34] also explores over-the-air computation for model aggregation in federated learning. Concretely, [34] puts the principle into practice by modeling the device selection and beamforming design as a sparse and low-rank optimization problem, which is combinatorial and highly intractable.…”
Section: Problem Definition, Model Construction, Algorithm Design
confidence: 99%
“…[34] also explores over-the-air computation for model aggregation in federated learning. Concretely, [34] puts the principle into practice by modeling the device selection and beamforming design as a sparse and low-rank optimization problem, which is combinatorial and highly intractable. To solve the problem with a fast convergence rate, the paper proposes a difference-of-convex-functions (DC) representation via successive convex relaxation.…”
Section: Problem Definition, Model Construction, Algorithm Design
confidence: 99%
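The DC idea mentioned in the statement, writing the objective as a difference of convex functions and repeatedly minimizing a convex surrogate obtained by linearizing the subtracted part, can be shown on a one-dimensional toy problem. This is an assumed illustration of the general technique, not the paper's sparse and low-rank formulation:

```python
# Minimize f(x) = g(x) - h(x) with g(x) = x**2 and h(x) = 2*abs(x), both
# convex. Each DC iteration linearizes h at the current iterate x_t and
# minimizes the convex surrogate g(x) - h(x_t) - s*(x - x_t), where s is
# a subgradient of h at x_t; here the surrogate minimizer is x = s/2 in
# closed form.

def dc_minimize(x0, iters=20):
    x = x0
    for _ in range(iters):
        s = 2.0 if x > 0 else (-2.0 if x < 0 else 0.0)  # subgradient of h at x
        x = s / 2.0                                     # argmin of the surrogate
    return x

x_star = dc_minimize(0.3)
print(x_star)                       # 1.0, a stationary point of f
print(x_star**2 - 2 * abs(x_star))  # f(x_star) = -1.0
```

Each surrogate upper-bounds f and touches it at the current iterate, so the objective value is non-increasing across iterations; that monotone descent is what gives DC methods their fast, reliable convergence to stationary points.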