2021
DOI: 10.48550/arxiv.2111.06867
Preprint

Flatee: Federated Learning Across Trusted Execution Environments

Abstract: Federated learning allows a machine learning model to be trained in a distributed fashion: multiple parties share local model parameters without sharing their private data. However, parameter exchange may still leak information. Several approaches based on multi-party computation, fully homomorphic encryption, and related techniques have been proposed to overcome this; many of these protocols are slow and impractical for real-world use because they involve a large number of cryptographic operations. In this paper, we propose the use of Trusted …
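As a minimal illustration of the parameter exchange described in the abstract (not the paper's actual TEE-based implementation), the Python sketch below runs a toy federated-averaging round: each hypothetical client performs a local gradient step on its private data, and only the resulting parameters are shared and averaged by the server.

```python
# A minimal sketch of federated parameter averaging (illustrative only).
import numpy as np

def local_update(params, data, lr=0.1):
    """Hypothetical local step: one gradient-descent step on a least-squares
    objective, standing in for each party's private training."""
    X, y = data
    grad = X.T @ (X @ params - y) / len(y)
    return params - lr * grad

def federated_average(client_params):
    """Server-side aggregation: plain averaging of the clients' parameters."""
    return np.mean(client_params, axis=0)

# Toy round with 3 clients whose raw data the server never sees.
rng = np.random.default_rng(0)
global_params = np.zeros(5)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
for _ in range(10):
    updates = [local_update(global_params.copy(), d) for d in clients]
    global_params = federated_average(updates)
```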

Cited by 2 publications (6 citation statements)
References 11 publications
“…For instance, in the context of Human Activity Recognition (HAR) mentioned earlier, a data custodian with highly sensitive information may opt for an exceptionally low privacy budget, while an average user might select a moderately higher privacy budget that balances privacy and utility. We observe that privacy-preserving techniques such as DP-SGD [1] may experience performance degradation as the privacy budget decreases, while alternatives like DP-PATE [27] offer enhanced performance and privacy guarantees at the cost of increased computational complexity. [Table: comparison of Flatee [24], Chex-mix [25], Ada-PPFL [16], PPFL-NonIID [32], PFSCL [31], ClusterFL [26], and LDP-Fed [38] across HAR, Personalized, TEE, HE, DP, and PPK support.] Consequently, we recognize that the choice between these methods should be driven by specific use-case requirements, enabling us to effectively quantify privacy while accommodating diverse and non-uniform privacy preferences.…”
Section: Introduction
Mentioning confidence: 88%
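The non-uniform privacy budgets discussed in this citation can be illustrated with a rough sketch (not the code of any cited framework): each client clips its update and adds Gaussian noise calibrated to its own (epsilon, delta) choice before sending it to the aggregator. The client names and budget values below are hypothetical.

```python
# Per-client privacy budgets via the Gaussian mechanism (illustrative sketch).
import numpy as np

def privatize(update, epsilon, delta=1e-5, clip=1.0, rng=np.random.default_rng(0)):
    """Clip the update to L2 norm `clip`, then add Gaussian noise with the
    standard (epsilon, delta) Gaussian-mechanism scale."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / (norm + 1e-12))
    sigma = clip * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(scale=sigma, size=update.shape)

# A sensitive custodian picks a tight budget, an average user a looser one.
budgets = {"custodian": 0.5, "average_user": 4.0}
updates = {name: np.ones(10) * 0.3 for name in budgets}
noisy = [privatize(u, budgets[name]) for name, u in updates.items()]
aggregate = np.mean(noisy, axis=0)
```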
“…Trusted Execution Environments (TEE), which establish secure enclaves inside CPUs, are used to implement this approach. Several works have incorporated TEEs within federated aggregation servers to secure operations [25,24]. For example, in [25], the server's inference model is placed inside a TEE to ensure secure inference.…”
Section: Airtight Box-Based Defenses
Mentioning confidence: 99%
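A conceptual Python sketch of the enclave-based aggregation pattern these works describe is given below; it is not SGX or TrustZone code. Symmetric encryption with the `cryptography` package's Fernet stands in for an enclave-provisioned channel: the untrusted host only ever handles ciphertexts, while decryption and averaging happen behind the "enclave" boundary.

```python
# TEE-style aggregation, sketched with Fernet as a stand-in for enclave keys.
import numpy as np
from cryptography.fernet import Fernet

class Enclave:
    """Hypothetical trusted boundary: keys and plaintext updates never leave it."""
    def __init__(self):
        self._cipher = Fernet(Fernet.generate_key())

    def provision_channel(self):
        # In practice a client would obtain this only after remote attestation.
        return self._cipher

    def aggregate(self, ciphertexts):
        # Decryption and averaging happen only inside the trusted boundary.
        updates = [np.frombuffer(self._cipher.decrypt(c), dtype=np.float64)
                   for c in ciphertexts]
        return np.mean(np.stack(updates), axis=0)

enclave = Enclave()
channel = enclave.provision_channel()
# Clients encrypt their local updates; the host only ever sees ciphertexts.
ciphertexts = [channel.encrypt(np.full(4, float(i)).tobytes()) for i in range(3)]
print(enclave.aggregate(ciphertexts))  # -> [1. 1. 1. 1.]
```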
“…To facilitate gradient computing, the server regularly distributes its model status to the clients, but this white-box exposure of the model renders the server vulnerable to, e.g., poisoning or inversion attacks from malicious clients (Shokri et al, 2017;Xie et al, 2020;Geiping et al, 2020). With that, recent attempts are made to exploit trusted execution environments (TEEs) in FL, which can isolate the model status within a black-box secure area and significantly reduce the success rate of malicious evasion (Chen et al, 2020;Mo et al, 2021;Mondal et al, 2021). However, TEEs are highly memoryconstrained (Truong et al, 2021), while backpropagation is memory-consuming to restore intermediate states.…”
Section: Introduction
Mentioning confidence: 99%
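The memory tension noted in this citation can be made concrete with a back-of-the-envelope sketch (illustrative layer shapes, not taken from any of the cited papers): the activations that backpropagation must keep for the backward pass grow with batch size and spatial resolution, while a classic SGX enclave offers only on the order of ~100 MB of protected memory.

```python
# Rough memory estimate for a small convolutional stack (illustrative shapes).
feature_maps = [(3, 224), (64, 224), (128, 112), (256, 56), (512, 28)]  # (channels, side)
batch, bytes_per_float = 32, 4

# Activations stored for the backward pass, per batch.
acts_mb = sum(c * s * s for c, s in feature_maps) * batch * bytes_per_float / 2**20
# Parameters of the 3x3 convolutions connecting consecutive feature maps.
params_mb = sum(c_in * c_out * 9 for (c_in, _), (c_out, _) in
                zip(feature_maps, feature_maps[1:])) * bytes_per_float / 2**20

print(f"3x3-conv parameters: {params_mb:.1f} MB")
print(f"stored activations (batch={batch}): {acts_mb:.1f} MB")
```

With these hypothetical shapes, the stored activations (hundreds of MB) far exceed the parameter footprint (a few MB), which is exactly the memory pressure the quoted passage points to.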