2022 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Scalable Computing & Communications, Digital Twin, Privacy Computing, Metaverse (2022)
DOI: 10.1109/smartworld-uic-atc-scalcom-digitaltwin-pricomp-metaverse56740.2022.00100
AFMeta: Asynchronous Federated Meta-learning with Temporally Weighted Aggregation

Cited by 7 publications (2 citation statements); references 23 publications.
“…After experimental analysis with the CIFAR-10 dataset, the maximum accuracy of a model is found to be 91.15%. [6] This paper proposes Asynchronous Federated Meta-learning with Temporally Weighted aggregation (AFMeta-TW), one of the first works on the asynchronous mode of federated meta-learning. To enhance model aggregation, it measures the staleness associated with the local models. A temporally weighted aggregation approach is used to combat temporal heterogeneity effectively and efficiently. The system aims to build an initial model with heterogeneous clients asynchronously, reducing learning time and adopting a meta-model for classification to achieve considerable performance.…”
Section: II
confidence: 99%
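The staleness-aware, temporally weighted aggregation described in the statement above can be sketched as follows. This is a minimal illustration only: the function name, the exponential decay `exp(-alpha * staleness)`, and the parameter `alpha` are assumptions for exposition, not the exact weighting scheme from the AFMeta-TW paper.

```python
import numpy as np

def temporally_weighted_aggregate(updates, current_round, alpha=0.5):
    """Aggregate asynchronous client updates, down-weighting stale ones.

    updates: list of (params, round_received) pairs, where params is an
             ndarray of model parameters and round_received is the server
             round at which the client pulled the model it trained on.
    The exponential decay of staleness is an illustrative choice.
    """
    staleness = np.array([current_round - r for _, r in updates], dtype=float)
    weights = np.exp(-alpha * staleness)   # fresher updates get larger weight
    weights /= weights.sum()               # normalize to a convex combination
    return sum(w * params for w, (params, _) in zip(weights, updates))
```

With `alpha=0.5`, an update that is five rounds stale receives roughly `exp(-2.5) ≈ 0.08` of the weight of a fresh one before normalization, so fresh clients dominate the aggregate while stale contributions are not discarded outright.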
“…EL techniques enable the integration of multiple machine learning algorithms trained on the same dataset [10][11][12]. A variety of ensembling techniques has been developed, facilitating the most efficient combination of base learners of either the same (homogeneous) or different (heterogeneous) types.…”
Section: Introduction
confidence: 99%
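The base-learner combination mentioned above can be illustrated with a minimal majority-vote combiner, which works identically for homogeneous and heterogeneous ensembles since it only consumes predicted labels. The function name and the plain hard-voting rule are illustrative assumptions, not a specific method from the cited works.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model label predictions by hard majority vote.

    predictions: list of equal-length label sequences, one per base learner.
    Returns one label per sample, the most common vote across learners
    (Counter.most_common breaks ties by first-seen order).
    """
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*predictions)]
```

Because the combiner sees only labels, the base learners behind it may be three copies of the same algorithm (homogeneous) or, say, a tree, an SVM, and a neural network (heterogeneous) without any change to the voting code.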