2020
DOI: 10.1109/tnnls.2019.2921926
Distributed Training for Multi-Layer Neural Networks by Consensus

Abstract: Over the past decade, there has been a growing interest in large-scale and privacy-concerned machine learning, especially in the situation where the data cannot be shared due to privacy protection or cannot be centralized due to computational limitations. Parallel computation has been proposed to circumvent these limitations, usually based on the master-slave and decentralized topologies, and the comparison study shows that a decentralized graph could avoid the possible communication jam on the central agent b…

Cited by 33 publications (35 citation statements). References 30 publications.
“…Optimisation and learning over distributed networks have been widely studied in recent years, owing to their significant potentials in many biological, engineering, and social applications [1][2][3][4][5][6]. Several critical limitations of the centralised methods can be addressed by the distributed algorithms: first, communicational requirement is relieved as information exchanges are confined to adjacent neighbours; second, local datasets can be kept private and do not need to be revealed to remote fusion centres; third, computational burdens are distributed into a set of agents, where each of them only needs to process its local datasets.…”
Section: Introduction (mentioning, confidence: 99%)
“…More recently, considerable interests have been concentrated on training neural networks using distributed datasets [4,[13][14][15]. One of the most important reasons is that we are now faced with the era of big data, and massive amounts of information are generated everyday and everywhere.…”
Section: Introduction (mentioning, confidence: 99%)
“…The distributed control problem of large complex network systems is always one of the major barriers in industrial applications. Various studies on distributed control have pointed to air traffic control, sensor networks, satellite formation control, multi-robot docking, distributed learning, and so on [1], [2]. The aim of distributed control is to design a control policy for each connected agent such that the evolutions of all the agents will finally contribute to the desired behaviour of the network system [3].…”
Section: Introduction (mentioning, confidence: 99%)
“…However, each agent needs to communicate with its neighbors many times to achieve consensus, which is computationally expensive. Liu et al [21] proposed a distributed neural networks framework based on the consensus algorithm, where all agents only need a single consensus step with their connected neighbors after every training update. Lian et al [22] theoretically analyzed the centralized and decentralized algorithms on their convergence rate, which shows that the decentralized method has a faster convergence rate than its centralized counterpart on high latency or low bandwidth system and can bring asymptotically linear speed up when more agents are available.…”
Section: Introduction (mentioning, confidence: 99%)
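The quoted passage describes the key mechanism of the cited framework: each agent performs a local training update on its private data and then a single consensus (weighted averaging) step with its directly connected neighbours. The following is a minimal illustrative sketch of that update pattern, not the authors' reference implementation; the ring topology, mixing weights, linear model, and synthetic data are all assumptions made here for demonstration.

```python
# Minimal sketch of consensus-based distributed training:
# 1) each agent takes one local gradient step on its private dataset,
# 2) each agent performs a single consensus step with its neighbours.
# Topology, model, and data are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, lr = 4, 5, 0.1

# Ring topology with a doubly stochastic mixing matrix:
# each agent averages with itself and its two neighbours.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

# Private local datasets: y_i = X_i @ w_true + noise, never shared.
w_true = rng.normal(size=dim)
X = [rng.normal(size=(20, dim)) for _ in range(n_agents)]
y = [Xi @ w_true + 0.01 * rng.normal(size=20) for Xi in X]

w = np.zeros((n_agents, dim))  # one parameter vector per agent
for step in range(200):
    # 1) Local gradient step on each agent's own data (least-squares loss).
    grads = np.stack([Xi.T @ (Xi @ wi - yi) / len(yi)
                      for Xi, yi, wi in zip(X, y, w)])
    w = w - lr * grads
    # 2) Single consensus step: mix parameters with connected neighbours only.
    w = W @ w

print("max disagreement between agents:", np.abs(w - w.mean(axis=0)).max())
print("distance to ground truth:", np.linalg.norm(w.mean(axis=0) - w_true))
```

Because only one mixing step follows each local update, communication per iteration stays proportional to the number of neighbours rather than requiring repeated averaging rounds until exact consensus, which is the computational saving the quoted passage highlights.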