2021
DOI: 10.1109/mis.2020.3005930

Differentially Private Collaborative Coupling Learning for Recommender Systems

Abstract: Coupling learning is designed to estimate, discover, and extract the interactions and relationships among learning components. It provides insights into complex interactive data and has been extensively incorporated into recommender systems to enhance the interpretability of sophisticated relationships between users and items. Coupling learning can be further fostered once trending collaborative learning is engaged to take advantage of cross-platform data. To facilitate this, privacy-preserving sol…

Cited by 12 publications (8 citation statements)
References 20 publications
"…Another line of studies approaches privacy-preserving FL through differential privacy (DP) mechanisms [2,8,12,16,21,42,44,48,49]. The common practice for achieving differential privacy is additive noise calibrated to the sensitivity S_2^∇F of the gradient ∇F.…"
Section: Learning With Differential Privacy
confidence: 99%
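The excerpt above describes the standard additive-noise recipe: bound the gradient's L2 sensitivity by clipping, then add noise scaled to that bound. A minimal sketch of that idea follows; it is not code from the cited papers, and the function name and parameters (clip_norm, noise_multiplier) are illustrative assumptions.

```python
import numpy as np

def dp_gradient(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip per-example gradients and add Gaussian noise (Gaussian mechanism).

    per_example_grads: array of shape (batch_size, dim).
    clip_norm: bound C on each example's gradient L2 norm, so the L2
               sensitivity of the summed gradient is C.
    noise_multiplier: ratio sigma / C; larger values mean stronger privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / (norms + 1e-12))
    clipped = per_example_grads * scale            # each row now has norm <= clip_norm
    summed = clipped.sum(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)  # noisy average gradient
```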
"…In traditional FL, every participant owns its local training dataset and updates the same global model w_k via a parameter server using its local model/gradients. The local gradients ∇F(w_k, ξ_{l,k}) can be protected via either secure aggregation [4,30,31,34,40,47] or differential privacy mechanisms [2,8,21,42,44,48]. This process can be decentralized by replacing the parameter server with a peer-to-peer communication mechanism [22,39].…"
Section: CGD Optimization
confidence: 99%
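To make the parameter-server workflow in this excerpt concrete, here is a toy sketch of one federated round in which each client computes a local gradient ∇F(w_k, ξ), clips it, and optionally adds Gaussian noise before sending it to the server. It is a simplified illustration under assumed names (local_gradient, server_round) and a least-squares loss, not the protocol of the cited works.

```python
import numpy as np

def local_gradient(w, data, clip=1.0, sigma=0.0, rng=None):
    """Client step: least-squares gradient on the local batch (X, y),
    clipped and optionally noised before it leaves the client."""
    rng = np.random.default_rng() if rng is None else rng
    X, y = data
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    grad *= min(1.0, clip / (np.linalg.norm(grad) + 1e-12))
    return grad + rng.normal(0.0, sigma * clip, size=grad.shape)

def server_round(w, clients, lr=0.1, sigma=0.0):
    """Parameter-server round: gather protected gradients, average, update w."""
    grads = [local_gradient(w, c, sigma=sigma) for c in clients]
    return w - lr * np.mean(grads, axis=0)

# Toy usage: three clients with synthetic data share one global model.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(32, 5)), rng.normal(size=32)) for _ in range(3)]
w = np.zeros(5)
for _ in range(50):
    w = server_round(w, clients, lr=0.05, sigma=0.5)
```

In the decentralized variant mentioned in the excerpt, the averaging in server_round would instead be performed over messages exchanged peer to peer.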
"…There have also been some previous research efforts that explore collaborative learning without exposing the trained models [24,33,46,47]. For example, Papernot et al. [33] use transfer learning in combination with differential privacy to learn an ensemble of teacher models on data partitions, and then use these models to train a private student model.…"
Section: Related Work
confidence: 99%
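The teacher-student scheme described above labels the student's public training inputs by a noisy vote over the teachers' predictions. The sketch below shows one way such an aggregation step could look; the function name, the gamma parameter, and the Laplace noise scale are assumptions for illustration rather than the exact mechanism of [33].

```python
import numpy as np

def noisy_teacher_vote(teacher_preds, num_classes, gamma=0.1, rng=None):
    """Aggregate teacher predictions for one query with Laplace-noised counts.

    teacher_preds: one predicted class label per teacher, where each teacher
                   was trained on a disjoint data partition.
    gamma: inverse noise scale; smaller gamma means more noise.
    Returns the label with the highest noisy vote count; the student is then
    trained on (public input, noisy label) pairs only.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.bincount(teacher_preds, minlength=num_classes).astype(float)
    counts += rng.laplace(0.0, 1.0 / gamma, size=num_classes)
    return int(np.argmax(counts))

# Example: eight hypothetical teachers voting over three classes.
label = noisy_teacher_vote([2, 2, 1, 2, 0, 2, 1, 2], num_classes=3, gamma=0.5)
```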