2020
DOI: 10.48550/arxiv.2011.11927
Preprint
Acceleration of Cooperative Least Mean Square via Chebyshev Periodical Successive Over-Relaxation

Tadashi Wadayama,
Satoshi Takabe

Abstract: A distributed algorithm for least mean square (LMS) can be used in distributed signal estimation and in distributed training of multivariate regression models. The convergence speed of such an algorithm is a critical factor because a faster algorithm incurs less communication overhead and requires less network bandwidth. The goal of this paper is to show that Chebyshev periodical successive over-relaxation (PSOR) can accelerate distributed LMS algorithms in a natural manner. The basic idea…
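The acceleration described in the abstract can be illustrated on a centralized toy problem. The sketch below, which is an illustrative assumption rather than the paper's exact distributed algorithm, applies a periodic schedule of Chebyshev step sizes (the inverse roots of a Chebyshev polynomial mapped onto the Hessian's eigenvalue range) to plain gradient-descent LMS and compares it with a constant step size. The problem sizes, period `T`, and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize ||A x - b||^2, with Hessian H = A^T A.
n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
H = A.T @ A
x_star = np.linalg.solve(H, A.T @ b)

# Extremal eigenvalues of the Hessian. In a truly distributed setting these
# would have to be bounded or estimated rather than computed exactly.
eigs = np.linalg.eigvalsh(H)
mu, L = eigs[0], eigs[-1]

T = 8  # period of the Chebyshev schedule
# Chebyshev step sizes: inverses of the roots of the degree-T Chebyshev
# polynomial rescaled to the interval [mu, L].
cheb_steps = [
    2.0 / (L + mu - (L - mu) * np.cos(np.pi * (2 * k + 1) / (2 * T)))
    for k in range(T)
]

def run(step_fn, iters=64):
    """Gradient descent on the quadratic with iteration-dependent steps."""
    x = np.zeros(d)
    for k in range(iters):
        x = x - step_fn(k) * (H @ x - A.T @ b)
    return np.linalg.norm(x - x_star)

err_const = run(lambda k: 1.0 / L)           # plain constant-step LMS
err_cheb = run(lambda k: cheb_steps[k % T])  # Chebyshev periodical schedule

print(f"constant step error:  {err_const:.3e}")
print(f"Chebyshev PSOR error: {err_cheb:.3e}")
```

On a well-conditioned quadratic like this one, the periodic Chebyshev schedule reaches a much smaller residual in the same number of iterations, which is the kind of communication-round saving the abstract refers to.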
