2023
DOI: 10.1609/aaai.v37i9.26245
Bayesian Federated Neural Matching That Completes Full Information

Abstract: Federated learning is a contemporary machine learning paradigm in which locally trained models are distilled into a global model. Owing to the intrinsic permutation invariance of neural networks, Probabilistic Federated Neural Matching (PFNM) employs a Bayesian nonparametric framework to model the generation of local neurons, and then solves a linear sum assignment problem in each alternating optimization iteration. However, according to our theoretical analysis, the optimization iteration in PFNM omits global in…
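The core mechanical step the abstract describes — aligning each client's neurons to the global model's neurons before averaging, via a linear sum assignment in every alternating-optimization iteration — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the cost here is plain squared Euclidean distance between weight vectors, whereas PFNM derives the assignment cost from a Bayesian nonparametric posterior, and the helper names (`match_neurons`, `matched_average`) are invented for this sketch.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def match_neurons(global_w, local_w):
    """Align local neurons to global neurons by minimal squared distance.

    global_w, local_w: arrays of shape (n_neurons, n_inputs).
    Returns (rows, cols): global index rows[k] is matched to local
    index cols[k]. A stand-in for PFNM's posterior-based cost.
    """
    # cost[i, j] = ||global neuron i - local neuron j||^2
    cost = ((global_w[:, None, :] - local_w[None, :, :]) ** 2).sum(axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return rows, cols


def matched_average(global_w, local_weights):
    """One matched-averaging step: permute each local model into the
    global ordering, then average the aligned weights."""
    acc = np.zeros_like(global_w)
    for lw in local_weights:
        rows, cols = match_neurons(global_w, lw)
        aligned = np.empty_like(lw)
        aligned[rows] = lw[cols]  # undo the local permutation
        acc += aligned
    return acc / len(local_weights)
```

For example, if every client holds a row-permuted copy of the same weight matrix, `matched_average` recovers the original matrix exactly, because the assignment finds the zero-cost permutation for each client.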

Cited by 1 publication (1 citation statement)
References 17 publications
“…It solves the matched averaging objective by iterative optimization. Through theoretical analysis, Xiao and Cheng [41] found that global information can be omitted by PFNM. To fix this missing global information issue, an algorithm that conducts neural aggregation with full information (NAFI) is developed.…”
Section: Bayesian Methods
Confidence: 99%