2016
DOI: 10.1145/2854157

Robust Decentralized Low-Rank Matrix Decomposition

Abstract: Low-rank matrix approximation is an important tool in data mining with a wide range of applications including recommender systems, clustering, and identifying topics in documents. When the matrix to be approximated originates from a large distributed system, such as a network of mobile phones or smart meters, this is a very challenging problem due to the strongly conflicting, yet essential requirements of efficiency, robustness, and privacy preservation. We argue that while collecting sensitive data in a central…
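As a point of reference for the rank-k factorization the abstract refers to, here is a minimal centralized sketch that computes A ≈ X·Y with a truncated SVD in numpy. This is not the decentralized, privacy-preserving algorithm of the paper; the function name low_rank_approx and the choice of SVD are illustrative assumptions only.

import numpy as np

def low_rank_approx(A, k):
    # Best rank-k approximation of A (in Frobenius norm) via truncated SVD,
    # returned as two factors X (m x k) and Y (k x n) with A ~= X @ Y.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    X = U[:, :k] * s[:k]   # absorb the singular values into the left factor
    Y = Vt[:k, :]
    return X, Y

# Example: approximate a random 100 x 20 matrix with rank 5.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
X, Y = low_rank_approx(A, k=5)
print("rank-5 reconstruction error:", np.linalg.norm(A - X @ Y))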

Cited by 21 publications (22 citation statements)
References 44 publications
“…Our targeted application environment consists of a potentially large set of personal devices holding private data. We follow the approach in our previous paper [9] and we will adapt the same approach to federated learning. We shall assume that each row in matrix A is stored on exactly one device.…”
Section: Optimization Approach (mentioning; confidence: 99%)
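To make the quoted data layout concrete, the following is a minimal sketch, assuming (as in the quote) that each row a_i of A lives on exactly one device: the device keeps its private row x_i of the left factor, and only the shared factor Y ever leaves the device. The class name Device, the method local_step, the plain SGD update, and the sequential simulation loop are illustrative assumptions, not the protocol of the cited papers.

import numpy as np

class Device:
    def __init__(self, a_i, k, rng):
        self.a = a_i                           # private data row of A (length n), never shared
        self.x = 0.1 * rng.standard_normal(k)  # private row of the left factor X

    def local_step(self, Y, lr=0.01):
        # One SGD step on ||x @ Y - a||^2; x stays local, an updated copy of Y is returned.
        err = self.x @ Y - self.a
        grad_x = err @ Y.T
        grad_Y = np.outer(self.x, err)
        self.x -= lr * grad_x
        return Y - lr * grad_Y

# Sequential simulation; in a real deployment Y would circulate among the devices.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 30))
devices = [Device(A[i], k=5, rng=rng) for i in range(A.shape[0])]
Y = 0.1 * rng.standard_normal((5, 30))
for _ in range(100):
    for d in devices:
        Y = d.local_step(Y)
print("residual:", np.linalg.norm(A - np.vstack([d.x for d in devices]) @ Y))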
“…For this reason, the local update calculation is more thorough, and compression techniques can be applied when uploading the updates to the server. Gossip learning has also been proposed to address the same challenge [9,14]. This approach is fully decentralized; no parameter server is necessary.…”
Section: Introduction (mentioning; confidence: 99%)
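The gossip-learning alternative mentioned in this excerpt can be illustrated with a minimal sketch, assuming a simple merge-then-update rule: each node averages a received model with its own copy and then takes SGD steps on its local data, with no parameter server involved. The class Node, the linear model, and the synchronous gossip_round loop are illustrative assumptions rather than the cited protocol.

import random
import numpy as np

class Node:
    def __init__(self, data, dim):
        self.data = data            # local (x, y) pairs, never shared
        self.w = np.zeros(dim)      # this node's copy of a linear model

    def local_update(self, lr=0.1):
        for x, y in self.data:      # plain SGD on squared error
            self.w -= lr * (self.w @ x - y) * x

    def receive(self, w_remote):
        self.w = (self.w + w_remote) / 2   # merge the received model with the local one
        self.local_update()

def gossip_round(nodes, rng):
    # Each node sends its current model to one uniformly chosen peer.
    for node in nodes:
        peer = rng.choice([n for n in nodes if n is not node])
        peer.receive(node.w.copy())

# Example with synthetic local datasets.
rng = random.Random(2)
np_rng = np.random.default_rng(2)
true_w = np_rng.standard_normal(4)
nodes = [Node([(x, float(x @ true_w)) for x in np_rng.standard_normal((10, 4))], dim=4)
         for _ in range(8)]
for _ in range(30):
    gossip_round(nodes, rng)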
“…Here, the model that is being fit on the data performs a uniform random walk over the network, and it is updated before each step using the local data. Recently, the same idea has been applied to matrix factorization as well, which is useful, for example, in implementing decentralized recommender systems [2].…”
(mentioning; confidence: 99%)
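A minimal sketch of the random-walk idea in this excerpt, assuming a linear model updated by SGD on each visited node's local data before hopping to a uniformly chosen neighbor. The graph encoding and the function name random_walk_training are hypothetical, not the algorithm of [2].

import random
import numpy as np

def random_walk_training(neighbors, local_data, dim, steps, lr=0.1, seed=0):
    # neighbors: dict node -> list of adjacent nodes
    # local_data: dict node -> list of (x, y) pairs held only by that node
    rng = random.Random(seed)
    w = np.zeros(dim)
    node = rng.choice(list(neighbors))       # start the walk at a random node
    for _ in range(steps):
        for x, y in local_data[node]:        # update the model with the local data...
            w -= lr * (w @ x - y) * x
        node = rng.choice(neighbors[node])   # ...then hop to a uniform random neighbor
    return w

# Example on a ring of 6 nodes with synthetic local data.
np_rng = np.random.default_rng(3)
true_w = np_rng.standard_normal(3)
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
data = {i: [(x, float(x @ true_w)) for x in np_rng.standard_normal((5, 3))] for i in range(6)}
w = random_walk_training(ring, data, dim=3, steps=50)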