2012
DOI: 10.1007/s13748-012-0035-5

A survey of methods for distributed machine learning

Abstract: Traditionally, a bottleneck preventing the development of more intelligent systems was the limited amount of data available. Nowadays, the total amount of information is almost incalculable and automatic data analyzers are even more needed. However, the limiting factor is the inability of learning algorithms to use all the data to learn within a reasonable time. In order to handle this problem, a new field in machine learning has emerged: large-scale learning. In this context, distributed learning seems to be …

Cited by 171 publications (70 citation statements)
References 29 publications
“…However, a bottleneck preventing such a big blessing is the inability of learning algorithms to use all the data to learn within a reasonable time. In this context, distributed learning seems to be a promising research direction, since allocating the learning process among several workstations is a natural way of scaling up learning algorithms [42]. Different from the classical learning framework, in which one requires the collection of the data in a database for central processing, in the framework of distributed learning the learning is carried out in a distributed manner [43].…”
Section: Representation Learning (mentioning)
confidence: 99%
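The framework described in the statement above can be illustrated with a minimal sketch: each workstation fits a model on its own data partition, and only the learned parameters are aggregated, so no raw data is collected centrally. The data partitioning, the linear least-squares model, and the simple parameter averaging below are illustrative assumptions, not the specific algorithms covered by the survey.

```python
# Minimal sketch of distributed learning by local training plus model
# averaging (illustrative; not the survey's specific algorithms).
import numpy as np

def local_fit(X, y):
    # Each workstation solves least squares on its own partition only;
    # the raw data never leaves the worker.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def distributed_fit(partitions):
    # The coordinator aggregates the locally learned parameters by
    # averaging, instead of collecting the data in a central database.
    local_models = [local_fit(X, y) for X, y in partitions]
    return np.mean(local_models, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0, 0.5])
    # Simulate three workstations, each holding a private shard of the data.
    partitions = []
    for _ in range(3):
        X = rng.normal(size=(200, 3))
        y = X @ true_w + rng.normal(scale=0.1, size=200)
        partitions.append((X, y))
    print("averaged model:", distributed_fit(partitions))
```

Plain averaging works in this toy setting because every shard is drawn from the same distribution; with heterogeneous partitions, the choice of aggregation rule is where distributed methods differ most.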
“…With the advantage of distributed computing for managing big volumes of data, distributed learning avoids the necessity of gathering data into a single workstation for central processing, saving time and energy. It is expected that more widespread applications of distributed learning are on the way [42]. Similar to distributed learning, another popular learning technique for scaling up traditional learning algorithms is parallel machine learning [48].…”
Section: Representation Learning (mentioning)
confidence: 99%
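As a contrast to the distributed setting, the statement above also mentions parallel machine learning; a rough single-machine sketch is given below, where the data stays in one place and only the gradient computation for each batch is spread over local worker processes. The linear model, the batch split, and the step size are assumptions made for illustration, not methods taken from the cited references.

```python
# Rough sketch of parallel (single-machine) learning: per-batch gradients
# are computed by a pool of local processes and combined before each
# synchronous update (illustrative assumptions only).
from multiprocessing import Pool
import numpy as np

def batch_gradient(args):
    # Squared-error gradient of a linear model on one batch.
    w, X, y = args
    return 2.0 * X.T @ (X @ w - y) / len(y)

def parallel_gd(X, y, n_workers=4, epochs=100, lr=0.1):
    w = np.zeros(X.shape[1])
    batches = np.array_split(np.arange(len(y)), n_workers)
    with Pool(n_workers) as pool:
        for _ in range(epochs):
            grads = pool.map(batch_gradient,
                             [(w, X[idx], y[idx]) for idx in batches])
            w -= lr * np.mean(grads, axis=0)  # synchronous update
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(800, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=800)
    print("fitted model:", parallel_gd(X, y))
```

The key difference from the distributed sketch is that here the workers share the full dataset and coordinate every step, whereas distributed learning exchanges only model parameters between data-holding workstations.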
“…A survey of distributed machine learning methods is given in [13]. For automatic machine analysis of large volumes of data, a large-scale learning approach is proposed.…”
Section: Analysis of Literature Sources (unclassified)
“…Since ANNs and SNNs simulate different characteristics of biological neural networks, ANN-based deep neural networks (DNNs) and SNN-based learning systems are both being developed continuously for different purposes. Currently, implementations of neural networks are mostly based on the von Neumann computing system (VCS), such as the CPU, the GPU, and their clusters, which is powerful for logical computing but not efficient for neuronal and synaptic computing. Figure (a) illustrates the complexity relationship between the data environment and the machine.…”
Section: Introduction (mentioning)
confidence: 99%