2009
DOI: 10.1007/978-3-642-03869-3_91

Implementing Parallel Google Map-Reduce in Eden

Abstract: Recent publications have emphasised map-reduce as a general programming model (labelled Google map-reduce), and described existing high-performance implementations for large data sets. We present two parallel implementations for this Google map-reduce skeleton, one following earlier work, and one optimised version, in the parallel Haskell extension Eden. Eden's specific features, like lazy stream processing, dynamic reply channels, and nondeterministic stream merging, support the efficient implementation of th…

Cited by 14 publications (20 citation statements). References 11 publications.
“…Using these skeletons with their remote data interfaces enables us to define a sequence consisting of a parallel map, a parallel transpose (realised using the all-to-all skeleton) and a second parallel map. This can be useful in an implementation of a parallel FFT skeleton [8] or a Google Map-Reduce skeleton [4]. In [4,8], corresponding parallel map-transpose skeletons have been defined as monolithic skeletons without composing simpler skeletons.…”
Section: Composing Predefined Skeletons (mentioning, confidence: 99%)
“…This can be useful in an implementation of a parallel FFT skeleton [8] or a Google Map-Reduce skeleton [4]. In [4,8], corresponding parallel map-transpose skeletons have been defined as monolithic skeletons without composing simpler skeletons. With the remote data interface, we can define the same skeleton as a composition of the three component skeletons.…”
Section: Composing Predefined Skeletons (mentioning, confidence: 99%)
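To make the composition described in the excerpts above concrete, here is a minimal sequential Haskell sketch of the map-transpose-map scheme. It is not the Eden skeleton from [4,8]: plain map and Data.List.transpose stand in for the parallel map and the all-to-all exchange over remote data, and the names mapTransposeMap, f and g are placeholders, so only the composition structure is shown.

    import Data.List (transpose)

    -- Sequential stand-in for the composed skeleton: a (parallel) map
    -- over the locally held chunks, a (parallel) transpose redistributing
    -- the data, and a second (parallel) map over the redistributed chunks.
    mapTransposeMap :: ([a] -> [b]) -> ([b] -> [c]) -> [[a]] -> [[c]]
    mapTransposeMap f g = map g . transpose . map f

    -- Example: a row pass, an exchange, and a column pass over a small matrix.
    example :: [[Int]]
    example = mapTransposeMap (map (* 2)) (map (+ 1)) [[1,2,3],[4,5,6]]
    -- example == [[3,9],[5,11],[7,13]]

In Eden the two maps would be process farms and the transpose an all-to-all exchange of remote data handles, but the point of the excerpts is exactly that the parallel version can be written as the same three-stage composition.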
“…The MapReduce programming model is designed to process large volumes of data in parallel by dividing a job into a set of independent tasks [8]-[10]. The job is referred to here as a full MapReduce program, which is the execution of a Mapper or Reducer across a set of data.…”
Section: Related Work (mentioning, confidence: 99%)
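As a reference point for the model this excerpt describes, the following is a small sequential Haskell sketch of the map-reduce scheme: a mapper emits key/value pairs, the pairs are grouped by key, and a reducer folds each group independently; these per-key groups are the independent tasks that a parallel implementation distributes. The names mapReduce and wordCount are illustrative, not taken from the cited implementations.

    import qualified Data.Map as Map

    -- Sequential model of Google map-reduce: map, group by key, reduce.
    mapReduce :: Ord k2
              => (k1 -> v1 -> [(k2, v2)])   -- mapper: emits key/value pairs
              -> (k2 -> [v2] -> v3)         -- reducer: folds one key group
              -> [(k1, v1)]                 -- input records
              -> [(k2, v3)]
    mapReduce mapper reducer input =
        Map.toList (Map.mapWithKey reducer groups)
      where
        groups = Map.fromListWith (++)
                   [ (k, [v]) | (k1, v1) <- input, (k, v) <- mapper k1 v1 ]

    -- Example: word counting over named documents.
    wordCount :: [(FilePath, String)] -> [(String, Int)]
    wordCount = mapReduce (\_ txt -> [ (w, 1) | w <- words txt ])
                          (\_ counts -> sum counts)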
“…It splits the input list into two halves, keeps the first half for local evaluation and creates a child process on PE 2 for sorting the second half. The remaining ticket list [3..noPe] is unshuffled into the two lists [3,5,7] and [4,6,8]. The first sublist is kept locally while the child process gets the second one.…”
Section: Divide-and-conquer (mentioning, confidence: 99%)
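The ticket handling in this excerpt can be reproduced with a plain Haskell sketch. The unshuffle below is a round-robin distribution in the spirit of Eden's auxiliary function of the same name; with noPe = 8 it splits the remaining ticket list [3..noPe] into [3,5,7] (kept locally) and [4,6,8] (handed to the child process on PE 2). noPe is fixed here only for illustration; in Eden it denotes the number of processing elements.

    -- Round-robin split of a list into n sublists (a sketch of the
    -- unshuffle used when handing out PE tickets in the skeleton).
    unshuffle :: Int -> [a] -> [[a]]
    unshuffle n xs = [ takeEach n (drop i xs) | i <- [0 .. n - 1] ]
      where
        takeEach _ []       = []
        takeEach k (y : ys) = y : takeEach k (drop (k - 1) ys)

    noPe :: Int
    noPe = 8   -- assumed number of processing elements for this example

    myTickets, childTickets :: [Int]
    [myTickets, childTickets] = unshuffle 2 [3 .. noPe]
    -- myTickets    == [3,5,7]
    -- childTickets == [4,6,8]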
“…Several parallel map implementations have been discussed and analysed in [33]. An Eden implementation of the large-scale map-and-reduce programming model proposed by Google [18] has been investigated in [6,4]. Hierarchical master-worker schemes with several layers of masters and submasters have been presented in [7].…”
Section: Further Reading (mentioning, confidence: 99%)