Proceedings of the Companion Conference on Genetic and Evolutionary Computation 2023
DOI: 10.1145/3583133.3590719

Byzantine-Resilient Learning Beyond Gradients: Distributing Evolutionary Search

Abstract: Modern machine learning (ML) models are capable of impressive performance. However, their prowess is due not only to improvements in their architectures and training algorithms but also to a drastic increase in the computational power used to train them. This increase has led to growing interest in distributed ML, which in turn has made worker failures and adversarial attacks an increasingly pressing concern. While distributed Byzantine-resilient algorithms have been proposed in a differentiable setting, no…
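The abstract contrasts gradient-based Byzantine-resilient training with a gradient-free, evolutionary setting. As an illustration only (not the paper's actual method), the sketch below shows one common robustness idea applied to a distributed evolution-strategies step: each perturbation's fitness is reported by several workers, and the server aggregates the reports with a median instead of a mean, so a minority of Byzantine workers cannot arbitrarily skew the update. All names, parameters, and the toy objective are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(theta):
    # Toy objective: maximize -||theta||^2 (optimum at the origin).
    return -float(np.sum(theta ** 2))

def es_step_median(theta, sigma=0.1, lr=0.05,
                   n_perturbations=50, n_workers=7, n_byzantine=2):
    """One evolution-strategies step in which each perturbation's fitness
    is reported by several workers and aggregated with a median, so that a
    Byzantine minority cannot arbitrarily skew the parameter update."""
    eps = rng.standard_normal((n_perturbations, theta.size))
    agg = np.empty(n_perturbations)
    for i, e in enumerate(eps):
        # Honest workers all evaluate the same perturbed parameters
        # (in practice their reports would differ by evaluation noise).
        honest = np.full(n_workers - n_byzantine, fitness(theta + sigma * e))
        # Byzantine workers report arbitrary values.
        byzantine = rng.uniform(-1e6, 1e6, size=n_byzantine)
        # Median aggregation: with a Byzantine minority, the median
        # always lies within the range of the honest reports.
        agg[i] = np.median(np.concatenate([honest, byzantine]))
    # Standard ES gradient estimate, computed on the robust fitness values.
    baseline = agg - agg.mean()
    grad = (baseline[:, None] * eps).sum(axis=0) / (n_perturbations * sigma)
    return theta + lr * grad

theta = rng.standard_normal(5)
for _ in range(200):
    theta = es_step_median(theta)
```

With 2 Byzantine workers out of 7, the median of the reports always falls inside the honest range, so the search converges toward the optimum despite the adversarial reports; a plain mean would be dominated by the arbitrary values.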

Cited by 1 publication · References 47 publications