Proceedings of the Web Conference 2021
DOI: 10.1145/3442381.3449832

Communication Efficient Federated Generalized Tensor Factorization for Collaborative Health Data Analytics

Abstract: Modern healthcare systems, knitted together by a web of entities (e.g., hospitals, clinics, pharmacy companies), collect a huge volume of healthcare data from a large number of individuals, covering various medical procedures, medications, diagnoses, and lab tests. To extract meaningful medical concepts (i.e., phenotypes) from such higher-arity relational healthcare data, tensor factorization has proven to be an effective approach and has received increasing research attention, due to its intrinsic capability to rep…
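
To make the abstract's core operation concrete, here is a minimal, hypothetical sketch of plain CP tensor factorization via alternating least squares on a toy patient × diagnosis × medication count tensor. It is not the paper's federated or generalized algorithm; the function names, rank, and toy dimensions are illustrative assumptions. Each column of the factor matrices can be read as a candidate phenotype.

```python
# Minimal sketch (not the paper's algorithm): rank-R CP factorization of a
# 3-way patient x diagnosis x medication tensor via alternating least squares.
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: (I x R), (J x R) -> (I*J x R)."""
    R = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, R)

def cp_als(X, rank, n_iters=50, seed=0):
    """Return factors (A, B, C) with X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                     # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(n_iters):
        A = X1 @ np.linalg.pinv(khatri_rao(B, C).T)  # solve for A with B, C fixed
        B = X2 @ np.linalg.pinv(khatri_rao(A, C).T)  # solve for B with A, C fixed
        C = X3 @ np.linalg.pinv(khatri_rao(A, B).T)  # solve for C with A, B fixed
    return A, B, C

# Toy example: 20 patients x 15 diagnoses x 10 medications, 4 candidate phenotypes.
X = np.random.default_rng(1).poisson(1.0, size=(20, 15, 10)).astype(float)
patients, diagnoses, medications = cp_als(X, rank=4)
```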

Cited by 21 publications (11 citation statements); references 32 publications.

Citation statements (ordered by relevance):
“…Liu et al [21] adapt trained models with similar data distributions to achieve better personalization results for FL. As for applications, researchers utilize FL to help query suggestion [50], keyboard prediction [11], health data analytics [22], human activity recognition [16], and user modeling [43]. As for empirical studies, Yang et al [48] conduct the first empirical study to characterize the impacts of heterogeneity in FL.…”
Section: Related Work
confidence: 99%
“…Recently, the emerging popular Federated Learning (FL) [25] mitigates some of those concerns, since data are kept locally and local confidentiality issues are addressed [32]. Although FL has drawn considerable attention from researchers and developers [16,21,22,43,48,51], it uses central custodians to keep model parameters, which could still be attacked to infer users' identities and interests [3,27,33,35,38], even with privacy-preserving Deep Learning (DL) techniques. Besides, the star-shaped architecture of FL weakens fault tolerance.…”
Section: Introduction
confidence: 99%
“…Their algorithms also only used compression in the uplink direction. The work in [33] extended the analysis in [19] to cover the multi-block case. However, [33] still deals with unconstrained optimization; the compression was also only for…”
Section: Convergence Analysis
confidence: 99%
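
The statement above contrasts uplink-only compression with schemes that also compress the downlink. A common way to realize uplink compression is top-k sparsification of the locally computed update before it is sent to the server; the sketch below is an illustrative scheme under that assumption, not the compression operator analyzed in [19] or [33], and the function names are hypothetical.

```python
# Illustrative uplink-only compression via top-k sparsification (assumed scheme,
# not the operator from the cited works): each node sends only the indices and
# values of its k largest-magnitude update entries.
import numpy as np

def topk_compress(update, k):
    """Return (indices, values) of the k largest-magnitude entries of a local update."""
    flat = update.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def topk_decompress(idx, values, shape):
    """Server-side reconstruction: zeros everywhere except the transmitted entries."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

# Example: send roughly 1.6% of a 256x128 local update over the uplink.
g = np.random.default_rng(0).standard_normal((256, 128))
idx, vals = topk_compress(g, k=512)
g_hat = topk_decompress(idx, vals, g.shape)
```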
“…First, observe that the update rule in (33) implies that there exists a Λ^(r+1) and λ^(r+1) such that the following optimality condition holds…”
Section: Supplementary Materials of "Communication-efficient Distribu…"
confidence: 99%
“…FedAvg calculates the average weights of the models of all users and shares the weights with each user in the FL system [18]. For instance, Ma et al [19] devised a communication-efficient federated generalized tensor factorization for electronic health records. Liu et al [20] used a federated adaptation framework to leverage the sparsity property of neural networks for generating privacy-preserving representations.…”
Section: Introduction
confidence: 99%
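
The FedAvg step quoted above reduces to a data-size-weighted average of client parameters on the server. A minimal sketch of that averaging step follows, assuming plain FedAvg rather than the cited paper's federated tensor factorization; the function name and list-of-arrays layout are illustrative.

```python
# Minimal FedAvg server step (assumed plain FedAvg, not the cited paper's method):
# average each client's parameter arrays, weighted by its local dataset size.
import numpy as np

def fedavg_round(client_weights, client_sizes):
    """client_weights: list (per client) of lists of np.ndarray layer parameters."""
    total = float(sum(client_sizes))
    averaged = [np.zeros_like(w, dtype=float) for w in client_weights[0]]
    for weights, n_k in zip(client_weights, client_sizes):
        for layer, w in enumerate(weights):
            averaged[layer] += (n_k / total) * w
    return averaged  # broadcast back to every client for the next local round
```

In a full round, each client would first run a few epochs of local training before its parameters are collected and averaged.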