2022
DOI: 10.1016/j.vehcom.2021.100396

Decentralized federated learning for extended sensing in 6G connected vehicles


Cited by 59 publications (32 citation statements)
References 63 publications
“…examples E_k are drawn randomly from the full training set E, while µ_s is the step-size and techniques such as model pruning, sparsification [24], parameter selection and/or differential transmission schemes [12], [25] are extremely helpful to scale down the footprint b(W) in large DNNs. However, due to the great variety of compression techniques, we have considered here only a simple quantization scheme.…”
Section: Energy Footprint Modeling Framework
confidence: 99%
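
The quantization remark above can be made concrete with a minimal sketch, assuming simple uniform quantization of the parameter vector W: the transmitted footprint b(W) drops from 32 bits to num_bits per weight, at the cost of a bounded reconstruction error. The function names and constants below are illustrative placeholders, not the scheme used in the cited works.

```python
import numpy as np

def quantize_weights(w, num_bits=8):
    """Uniformly quantize a float32 weight vector to num_bits per entry.

    Returns integer codes plus the (scale, offset) needed to dequantize,
    so the payload is roughly num_bits per weight instead of 32.
    """
    w = np.asarray(w, dtype=np.float32)
    w_min, w_max = float(w.min()), float(w.max())
    levels = 2 ** num_bits - 1
    scale = (w_max - w_min) / levels if w_max > w_min else 1.0
    codes = np.round((w - w_min) / scale).astype(np.uint32)
    return codes, scale, w_min

def dequantize_weights(codes, scale, w_min):
    """Reconstruct approximate float32 weights from the integer codes."""
    return codes.astype(np.float32) * scale + w_min

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=10_000).astype(np.float32)
    codes, scale, w_min = quantize_weights(w, num_bits=8)
    w_hat = dequantize_weights(codes, scale, w_min)
    # Footprint shrinks from 32 to 8 bits per weight (plus two floats of metadata).
    print("max abs reconstruction error:", np.abs(w - w_hat).max())
```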
“…The energy cost for computing E is critical for sustainable designs. Notice that, as opposed to data, b(W) is roughly the same for each device, although small changes might be observed when using lossy compression, sparsification or parameter selection methods [12], [24], [25].…”
Section: B. Federated Learning With Server (FA) and Deep-Sleep (FA-D)
confidence: 99%
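
A toy accounting of the per-round energy budget illustrates why the footprint b(W) matters: assume, purely hypothetically, that each device spends a fixed amount of energy per local training pass and a per-bit energy to upload a model of size b(W). All names and constants below are placeholders for illustration, not values from the cited works.

```python
def round_energy_per_device(model_bits, epochs_local=1,
                            e_comp_per_epoch_j=2.0,
                            e_tx_per_bit_j=5e-8):
    """Toy per-round energy model: local computation plus model upload.

    model_bits         -- payload size b(W) of the transmitted model, in bits
    epochs_local       -- local training passes per round
    e_comp_per_epoch_j -- assumed compute energy per local pass (J)
    e_tx_per_bit_j     -- assumed radio energy per transmitted bit (J/bit)
    """
    e_comp = epochs_local * e_comp_per_epoch_j
    e_comm = model_bits * e_tx_per_bit_j
    return e_comp + e_comm

# Example: a 1M-parameter model sent at 32 vs. 8 bits per weight.
full = round_energy_per_device(model_bits=1_000_000 * 32)
quant = round_energy_per_device(model_bits=1_000_000 * 8)
print(f"full-precision round: {full:.2f} J, 8-bit round: {quant:.2f} J")
```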
“…In contrast, if a blockchain system is implemented, it leverages Smart Contracts (SC) to coordinate the round delineation, model aggregation, and update tasks in FLS [130,152,184,203,254]. Lastly, if graph-based FLS is implemented, each client will utilize the graph neural network model with its neighbours to formulate the global models [15,84,143,283].…”
Section: Enabling Technologies
confidence: 99%
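
The graph-based variant described above, in which each client combines models with its neighbours instead of relying on a central server, can be sketched with a generic consensus-style mixing step over a communication graph. This is a minimal illustration under assumed data structures (plain parameter vectors, a neighbour list per node), not the GNN-based formulation of the cited works.

```python
import numpy as np

def neighbor_average_step(models, adjacency, mix_weight=0.5):
    """One consensus-style mixing step over a communication graph.

    models     -- dict: node id -> parameter vector (np.ndarray)
    adjacency  -- dict: node id -> list of neighbour ids
    mix_weight -- how strongly each node moves toward its neighbours' mean
    """
    updated = {}
    for node, w in models.items():
        neigh = adjacency.get(node, [])
        if not neigh:
            updated[node] = w.copy()
            continue
        neigh_mean = np.mean([models[n] for n in neigh], axis=0)
        updated[node] = (1 - mix_weight) * w + mix_weight * neigh_mean
    return updated

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Ring of 4 vehicles, each starting from a different local model.
    adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    models = {i: rng.normal(size=5) for i in range(4)}
    for _ in range(20):
        models = neighbor_average_step(models, adjacency)
    # After repeated mixing the local models converge toward a common average.
    print(np.std([models[i] for i in range(4)], axis=0))
```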
“…The authors in [42] used FL to predict the turning signal. More recently, the authors in [43], [44] used FL for 6G-enabled autonomous cars. Peng et al. [45] introduced an adaptive FL framework for autonomous vehicles.…”
Section: Related Work
confidence: 99%