2021
DOI: 10.48550/arxiv.2102.07767
Preprint

Communication-efficient Distributed Cooperative Learning with Compressed Beliefs

Abstract: We study the problem of distributed cooperative learning, where a group of agents seeks to agree on a set of hypotheses that best describes a sequence of private observations. In the scenario where the set of hypotheses is large, we propose a belief update rule where agents share compressed (either sparse or quantized) beliefs with an arbitrary positive compression rate. Our algorithm leverages a unified and straightforward communication rule that enables agents to access wide-ranging compression operators as b…
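The abstract's "compressed (either sparse or quantized) beliefs" refers to applying a compression operator to the belief vector before transmission. Below is a minimal Python sketch of two standard such operators, top-k sparsification and QSGD-style stochastic quantization; the function names and parameter choices are illustrative assumptions, not necessarily the paper's exact operators.

```python
import numpy as np

def top_k_sparsify(belief: np.ndarray, ratio: float) -> np.ndarray:
    """Keep the largest 100*ratio% of entries by magnitude; zero the rest.
    Only k (index, value) pairs need to be transmitted."""
    k = max(1, int(np.ceil(ratio * belief.size)))
    out = np.zeros_like(belief)
    idx = np.argpartition(np.abs(belief), -k)[-k:]
    out[idx] = belief[idx]
    return out

def stochastic_quantize(belief: np.ndarray, levels: int,
                        rng: np.random.Generator) -> np.ndarray:
    """QSGD-style unbiased quantization onto `levels` uniform levels,
    scaled by the vector norm, so that E[Q(x)] = x."""
    norm = np.linalg.norm(belief)
    if norm == 0.0:
        return belief.copy()
    scaled = np.abs(belief) / norm * levels
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part -> unbiasedness.
    level = lower + (rng.random(belief.shape) < (scaled - lower))
    return np.sign(belief) * level * norm / levels
```

Since beliefs are probability vectors, a receiver would typically renormalize after decompression; how the update rule handles this is exactly what the paper's algorithm specifies.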

Cited by 5 publications (8 citation statements)
References 15 publications
“…Check [26, Table 1] for the number of bits required for each operator. Details of other operators can be found in [31], [32].…”
Section: Problem Setup, Algorithm, and Results (mentioning)
confidence: 99%
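The "number of bits required for each operator" is an accounting of what each compressed message contains. A back-of-the-envelope sketch of that accounting is below; these are naive counts, not the exact figures of [26, Table 1], which may rely on tighter encodings.

```python
import math

def bits_top_k(d: int, k: int, float_bits: int = 32) -> int:
    """Top-k sparsification sends k values plus k indices into a
    length-d vector."""
    return k * float_bits + k * math.ceil(math.log2(d))

def bits_quantized(d: int, levels: int, float_bits: int = 32) -> int:
    """Naive cost of QSGD-style quantization: one norm, then a sign bit
    and a level per entry (ignoring entropy coding)."""
    return float_bits + d * (1 + math.ceil(math.log2(levels + 1)))

print(bits_top_k(d=300, k=30))           # 30*32 + 30*9 = 1230 bits
print(bits_quantized(d=300, levels=16))  # 32 + 300*6   = 1832 bits
```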
“…We first consider an average consensus problem with d = 300 parameters over directed ring graphs with different numbers of agents n ∈ {20, 50, 100, 200, 500}. We consider a grid over (0, 1] for the compression ratio ω, using top-100ω% sparsification [26] as the compression operator. We quantify the number of rounds required for each pair (n, ω) to reach ε-accuracy, where ε = 1 × 10⁻⁵.…”
Section: A. Consensus (mentioning)
confidence: 99%
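As a rough, runnable rendition of the quoted experiment, the sketch below runs compressed average consensus on a directed ring and counts rounds to ε-accuracy. It uses a CHOCO-Gossip-style update (compress only the difference to a publicly tracked copy), which is one standard way to make top-k compression converge; the citing paper's actual directed-graph algorithm, step size, and stopping rule are not shown in the quote, so treat all parameters here as assumptions.

```python
import numpy as np

def top_k(x: np.ndarray, ratio: float) -> np.ndarray:
    """Top-100*ratio% sparsifier, matching the quoted grid over omega."""
    k = max(1, int(np.ceil(ratio * x.size)))
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

def rounds_to_consensus(n=20, d=300, ratio=0.3, eps=1e-5,
                        gamma=0.4, max_rounds=200_000, seed=0):
    """Compressed average consensus on a directed ring (agent i hears
    agent i-1). Agents broadcast only the compressed difference to a
    publicly tracked copy x_hat; the uniform ring weights (1/2, 1/2)
    are doubly stochastic, so the true average is preserved."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n, d))
    target = x.mean(axis=0)
    x_hat = np.zeros_like(x)      # public copies known to the neighbors
    for t in range(1, max_rounds + 1):
        q = np.stack([top_k(x[i] - x_hat[i], ratio) for i in range(n)])
        x_hat = x_hat + q                           # update public copies
        neighbor_hat = np.roll(x_hat, 1, axis=0)    # message from agent i-1
        x = x + gamma * 0.5 * (neighbor_hat - x_hat)
        if np.abs(x - target).max() < eps:
            return t
    return max_rounds
```

Sweeping ratio over a grid in (0, 1] and n over {20, 50, 100, 200, 500} would then reproduce the shape of the quoted experiment.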
“…Hereafter, we drop ζ, ω from Q and E for simplicity of notation. The class of randomized operators introduced in (3) embraces a wide range of functions, both sparsification and quantization, some of which we mention in Example 1 [24].…”
Section: Problem Setup, Algorithm, and Results (mentioning)
confidence: 99%
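Parameters like ζ and ω in such operator classes typically bound the relative compression error, e.g. E‖Q(x) − x‖² ≤ ζ‖x‖² for unbiased operators. The sketch below instantiates one common member of such a class, rand-k sparsification, and checks that bound empirically; the citing paper's exact definition in its equation (3) is not reproduced in the quote, so the constant used here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_k(x: np.ndarray, k: int) -> np.ndarray:
    """Unbiased random sparsification: keep k uniformly chosen entries,
    rescaled by d/k so that E[rand_k(x)] = x."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = (d / k) * x[idx]
    return out

# Empirical check of the relative-variance bound defining the class:
# for rand-k, E||Q(x) - x||^2 = (d/k - 1) * ||x||^2.
d, k = 300, 30
x = rng.normal(size=d)
errs = [np.linalg.norm(rand_k(x, k) - x) ** 2 for _ in range(20_000)]
print(np.mean(errs) / np.linalg.norm(x) ** 2)   # ≈ d/k - 1 = 9.0
```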
“…Aiming to improve communication efficiency in detection networks, the work [19] proposes communicating with one randomly sampled agent instead of all neighbors at each time instant. Moreover, quantizing the beliefs is also possible and has been studied in [20], [21]. In our work, decreasing the communication burden on the nodes is instead achieved by transmitting partial beliefs.…”
Section: Introduction and Related Work (mentioning)
confidence: 99%
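"Transmitting partial beliefs" means each message carries only a subset of the belief vector's entries rather than a quantized copy of all of them. A minimal sketch of one plausible realization is below; the selection rule, message format, and renormalization step are illustrative assumptions, not the citing paper's protocol.

```python
import numpy as np

def partial_belief_message(belief: np.ndarray, num_entries: int,
                           rng: np.random.Generator):
    """Send only a random subset of belief entries as (index, value) pairs."""
    idx = rng.choice(belief.size, size=num_entries, replace=False)
    return idx, belief[idx]

def merge_partial(stale_copy: np.ndarray, idx: np.ndarray,
                  values: np.ndarray) -> np.ndarray:
    """Receiver patches the received entries into its stale copy of the
    sender's belief, then renormalizes to keep a valid distribution."""
    est = stale_copy.copy()
    est[idx] = values
    return est / est.sum()

# Example: a 1000-hypothesis belief, sending 50 entries per round.
rng = np.random.default_rng(1)
belief = rng.dirichlet(np.ones(1000))
idx, vals = partial_belief_message(belief, 50, rng)
estimate = merge_partial(np.full(1000, 1e-3), idx, vals)
```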