2018 10th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA)
DOI: 10.1109/icmtma.2018.00009

A Parallel Optimization of the Fast Algorithm of Convolution Neural Network on CPU

Cited by 6 publications (3 citation statements)
References 2 publications
“…Where $y_{jk} = y_k^{\min} + \mathrm{rand}(0,1)\,(y_k^{\max} - y_k^{\min})$, with j = no. of ways to find a solution, j = no. of trials (5). Distributions of probability: find the likelihood of potential candidates. If used bees are searched, the information of the highest hierarchy is passed.…”
Section: Reduce
confidence: 99%
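The formula quoted above (equation (5) in the citing paper) is the usual bounded random-candidate generation step used in bee-colony-style searches. A minimal sketch in Python, assuming simple per-dimension lower/upper bounds; the function and variable names are illustrative, not taken from the cited works:

```python
import random

def random_candidate(lower, upper):
    """Generate one candidate solution component-wise:
    y_jk = y_k_min + rand(0, 1) * (y_k_max - y_k_min)."""
    return [lo + random.random() * (hi - lo) for lo, hi in zip(lower, upper)]

# Example: a 3-dimensional search space with illustrative bounds
lower_bounds = [0.0, -5.0, 1.0]
upper_bounds = [10.0, 5.0, 2.0]
print(random_candidate(lower_bounds, upper_bounds))
```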
“…The Particle Swarm Optimization technique is used instead of Ant Bee Colony Optimization with MapReduce for maintaining clustering quality. New techniques need to apply parallel computing principles in order to scale with the data set sizes [5]. Some algorithms that are capable of handling large and semi-structured data are K-Means and ISODATA.…”
Section: Introduction
confidence: 99%
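The citing work pairs K-Means with MapReduce for large data sets. A minimal single-iteration sketch of that map (assign points to centroids) and reduce (recompute centroids) pattern in Python; the function names and toy data are assumptions for illustration, not from the cited papers:

```python
from collections import defaultdict

def kmeans_map(points, centroids):
    """Map step: emit (centroid index, point) for the nearest centroid."""
    for p in points:
        idx = min(range(len(centroids)),
                  key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
        yield idx, p

def kmeans_reduce(assignments):
    """Reduce step: recompute each centroid as the mean of its assigned points."""
    groups = defaultdict(list)
    for idx, p in assignments:
        groups[idx].append(p)
    return {idx: [sum(col) / len(pts) for col in zip(*pts)]
            for idx, pts in groups.items()}

# One iteration over a toy data set with two initial centroids
points = [(1.0, 1.0), (1.5, 2.0), (8.0, 8.0), (9.0, 9.5)]
centroids = [(0.0, 0.0), (10.0, 10.0)]
print(kmeans_reduce(kmeans_map(points, centroids)))
```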
“…Deep learning algorithms based on convolutional neural networks involve many floating point operations. CNN models mostly run in CPU [16] and GPU [17,18] environments. Although GPUs can achieve real-time processing, their high cost and power consumption make it difficult to satisfy the application requirements of edge computing scenarios.…”
Section: Introduction
confidence: 99%