Cloud Computing and Software Services 2010
DOI: 10.1201/ebk1439803158-c12

High-Performance Parallel Computing with Cloud and Cloud Technologies

Abstract: We present our experiences in applying, developing, and evaluating cloud and cloud technologies. First, we present our experience in applying Hadoop and DryadLINQ to a series of data/compute intensive applications and then compare them with a novel MapReduce runtime developed by us, named CGL-MapReduce, and MPI. Preliminary applications are developed for particle physics, bioinformatics, clustering, and matrix multiplication. We identify the basic execution units of the MapReduce programming model and categori…
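The abstract refers to the basic execution units of the MapReduce programming model. As a rough illustration (not the authors' CGL-MapReduce implementation; all names here are illustrative), those units can be sketched as a map stage that emits key/value pairs, a shuffle that groups them by key, and a reduce stage that folds each group:

```python
# Minimal sketch of MapReduce's execution units: map -> shuffle -> reduce.
# Function names are illustrative, not taken from any specific runtime.
from collections import defaultdict

def map_phase(records, mapper):
    # Apply the user-supplied mapper to each record, collecting (key, value) pairs.
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def shuffle(pairs):
    # Group all values by key, as the runtime does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    # Apply the user-supplied reducer to each key's group of values.
    return {key: reducer(key, values) for key, values in groups.items()}

# Classic word-count example.
def word_mapper(line):
    return [(word, 1) for word in line.split()]

def count_reducer(word, counts):
    return sum(counts)

lines = ["high performance computing", "cloud computing"]
result = reduce_phase(shuffle(map_phase(lines, word_mapper)), count_reducer)
print(result)  # {'high': 1, 'performance': 1, 'computing': 2, 'cloud': 1}
```

Real runtimes such as Hadoop additionally partition the shuffle across machines and handle fault tolerance; this sketch only shows the programming model's data flow.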

Cited by 21 publications (6 citation statements)
References 26 publications
“…Therefore, this application is primarily concerned with developing an algorithm to improve the accuracy of cryptocurrency price prediction. Numerous researchers have discussed cloud computing [15,17], deep learning algorithms [18], cryptocurrency price prediction and Bitcoin [6,7,9], and data parallelism [19,20] separately, as distinct topics. There is thus a potentially significant correlation among these approaches, and they can be experimented with together to build a precise model for cryptocurrency price prediction.…”
Section: Discussion
confidence: 99%
“…The experimental results confirmed that the GPU4-with-data-parallelism and GPU8-with-data-parallelism models can reduce the computation time to approximately 30 minutes for large batch sizes. A few authors have applied deep learning with parallel neural networks [17], data parallelism [15], and Parallel Consensual Neural Networks [20] to reduce computation time. Similarly, [19] discussed the effect of traffic flow in cloud computing on computation time using different types of parallel architectures.…”
Section: Discussion
confidence: 99%
“…We measured the performance and virtualization overhead of several MPI applications on virtual environments in an earlier study [24]. Here we present extended performance results for the Apache Hadoop implementations of SW-G and Cap3 in a cloud environment, comparing Hadoop on Linux with Hadoop on Linux on a Xen [26] para-virtualised environment.…”
Section: Performance in the Cloud
confidence: 96%
“…Note that approaches like Condor have significant startup times that dominate performance. For basic operations [24], we find that Hadoop and Dryad achieve similar performance on bioinformatics, particle physics, and well-known kernels. Wilde [25] has emphasized the value of scripting for controlling these (task-parallel) problems, and here DryadLINQ offers some capabilities that we exploited.…”
Section: Related Work
confidence: 99%
“…It also distinguishes between static data that does not change over the course of the iterations and normal data that may change during each iteration. Ekanayake et al. [14] and Zaharia et al. [41] also found that MapReduce is not suitable for many applications that need to reuse a working set of input data across parallel operations. The latter propose Spark, a framework that supports iterative applications yet retains the scalability and fault tolerance of MapReduce.…”
Section: Related Work
confidence: 99%
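The statement above contrasts classic MapReduce, which re-reads its input on every job, with frameworks like Spark that keep a working set in memory across iterations. The cost difference can be sketched as follows (a toy illustration, not Spark's actual API; `load_dataset` stands in for an expensive read from distributed storage):

```python
# Why iterative algorithms favor a cached working set: MapReduce-style
# execution reloads the (unchanging) input every iteration, while
# Spark-style execution loads it once and reuses it.

def load_dataset():
    # Stand-in for an expensive read from distributed storage.
    return list(range(1_000))

def step(dataset, state):
    # One iteration of a toy computation over the whole dataset.
    return state + sum(dataset)

# MapReduce-style: pay the load cost on every iteration.
state = 0
for _ in range(3):
    dataset = load_dataset()  # repeated I/O cost
    state = step(dataset, state)

# Spark-style: cache the working set once, reuse it across iterations.
cached = load_dataset()  # load cost paid once
cached_state = 0
for _ in range(3):
    cached_state = step(cached, cached_state)

assert state == cached_state  # same result, one load instead of three
```

With real distributed storage the repeated `load_dataset` calls dominate runtime for iterative workloads, which is the gap Spark's in-memory working sets address.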