2022 7th International Conference on Big Data Analytics (ICBDA) 2022
DOI: 10.1109/icbda55095.2022.9760359

An Asynchronous Distributed Training Algorithm Based on Gossip

Cited by 2 publications (2 citation statements)
References 12 publications
“…Finally, gossip is emerging as a candidate messaging technology for enabling the convergence of the training process in distributed AI applications. For example, gossip is used to efficiently exchange model information between computing clusters [10], [40], [41]. In fact, centralized ML algorithms adopting stochastic gradient descent may suffer from variable latency.…”
Section: Related Work
“…In fact, centralized ML algorithms adopting stochastic gradient descent may suffer from variable latency. The decentralized and asynchronous nature of gossip can successfully address this issue [10], [41].…”
Section: Related Work
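The citing works describe gossip as decentralized, server-free exchange of model state between nodes. A minimal sketch of that core idea is randomized pairwise averaging of per-node parameters; the single-scalar "model" per node and the uniform pair selection below are illustrative assumptions, not the algorithm from the cited paper:

```python
import random


def gossip_average(params, rounds, seed=0):
    """Simulate decentralized gossip averaging of per-node parameters.

    params: list of per-node parameter values (one float per node).
    Each round, one randomly chosen pair of nodes exchanges state and
    replaces both values with their average -- no central server is
    involved, mimicking asynchronous peer-to-peer model exchange.
    """
    rng = random.Random(seed)
    params = list(params)
    for _ in range(rounds):
        i, j = rng.sample(range(len(params)), 2)
        avg = (params[i] + params[j]) / 2.0
        params[i] = params[j] = avg  # pairwise averaging preserves the sum
    return params
```

Because each exchange preserves the sum of the parameters while shrinking their spread, all nodes drift toward the global mean without any coordinator, which is why gossip sidesteps the straggler-induced latency of a centralized parameter server.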