2018
DOI: 10.1007/s41060-018-0110-5
Large-scale asynchronous distributed learning based on parameter exchanges

Abstract: In many distributed learning problems, the heterogeneous loading of computing machines may harm the overall performance of synchronous strategies. In this paper, we propose an effective asynchronous distributed framework for the minimization of a sum of smooth functions, where each machine performs iterations in parallel on its local function and updates a shared parameter asynchronously. In this way, all machines can continuously work even though they do not have the latest version of the shared parameter. We…
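The scheme the abstract describes can be illustrated with a minimal single-process simulation, not the paper's actual algorithm: each "machine" holds a stale copy of the shared parameter, computes a gradient of its own smooth local function from that stale copy, and writes its update to the shared parameter without waiting for the others. The local objectives, step size, and update schedule below are all illustrative assumptions.

```python
import random

random.seed(0)

# Hypothetical local objectives: f_i(x) = 0.5 * (x - c_i)^2,
# so the minimizer of sum_i f_i(x) is the mean of the centers.
centers = [1.0, 3.0, 5.0, 7.0]

x = 0.0                                   # shared parameter
step = 0.05
stale = {i: x for i in range(len(centers))}  # each machine's last-seen copy

for t in range(8000):
    i = random.randrange(len(centers))    # an arbitrary machine fires next
    grad = stale[i] - centers[i]          # gradient at its (possibly stale) copy
    x -= step * grad                      # asynchronous write to the shared parameter
    stale[i] = x                          # machine then refreshes its local copy

print(round(x, 1))  # hovers near the mean of the centers
```

At a fixed point every stale copy equals the shared parameter, so the average gradient vanishes at the mean of the centers; the asynchrony only adds a bounded fluctuation around it. This is why the machines can keep working on stale parameters without stalling convergence on smooth problems.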

Cited by 3 publications (2 citation statements)
References 12 publications
“…We then describe the application of these systems to remote sensing data of the selected river stretches (Section 2.3). All processing was done in R using the terra and insol libraries, with the exception of the CNN, which was run in python 3.9.7 using TensorFlow 2.9.1 (Joshi et al, 2018) with Keras 2.9.0.…”
Section: Methods (citation type: mentioning)
confidence: 99%