2012
DOI: 10.5121/ijdps.2012.3113

Parallel Processing of cluster by Map Reduce

Abstract: MapReduce is a parallel programming model and an associated implementation introduced by Google. This paper gives an overview of the MapReduce programming model and its applications, and describes the workflow of the MapReduce process. Some important issues, such as fault tolerance, are studied in more detail, and the working of MapReduce is illustrated. The data locality issue in heterogeneous environments can noticeably reduce MapReduce performance. In this paper, the author addresses the il…
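To make the programming model described in the abstract concrete, the following is a minimal, self-contained Python sketch of the map, shuffle, and reduce phases using the standard word-count example. The function names and the in-memory shuffle are illustrative assumptions for exposition only, not the paper's implementation or any framework's API.

```python
from collections import defaultdict

# Minimal in-memory illustration of the MapReduce model (word count).
# The map step emits (key, value) pairs, the shuffle groups values by key,
# and the reduce step aggregates each group. Names here are illustrative.

def map_phase(document):
    """Emit a (word, 1) pair for every word in the input document."""
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Group intermediate values by key, as the framework would do."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Sum the counts emitted for one word."""
    return (key, sum(values))

if __name__ == "__main__":
    documents = ["map reduce splits work", "reduce combines map output"]
    intermediate = [pair for doc in documents for pair in map_phase(doc)]
    grouped = shuffle(intermediate)
    counts = dict(reduce_phase(k, v) for k, v in grouped.items())
    print(counts)  # e.g. {'map': 2, 'reduce': 2, 'splits': 1, ...}
```

In a real deployment the map and reduce calls run on different cluster nodes and the shuffle moves data over the network, which is where the fault-tolerance and data-locality issues discussed in the paper arise; this sketch only shows the data flow.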

Cited by 12 publications (1 citation statement)
References 1 publication
“…The third phase of the flowchart involves splitting each digest, abstracted resources and events into multiple units (Chattopadhyay et al., 2011; Vaidya, 2012). Splitting here means partitioning each document into small clusters which are a logical grouping by decomposing it into metadata units (Subramaniyaswamy et al., 2015; Kruijf and Sankaralingam, 2007; Singh and Singh, 2015).…”
Section: Phase 3: Splitting Of the News Digests, Library Resources' Abstracts And Event Abstracts
Mentioning confidence: 99%
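The splitting described in this citing excerpt can be pictured as decomposing a document into small metadata-tagged units before parallel processing. The Python sketch below is only an illustration of that idea under assumed choices (sentence-level tokenization, a fixed unit size, and the field names shown); it is not the cited authors' method.

```python
# Illustrative sketch: decompose a document into small metadata units
# (here, fixed-size groups of sentences). Unit size and field names
# are assumptions made for this example.

def split_into_units(doc_id, text, sentences_per_unit=2):
    """Partition a document's sentences into small metadata units."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    units = []
    for i in range(0, len(sentences), sentences_per_unit):
        units.append({
            "doc_id": doc_id,
            "unit_id": i // sentences_per_unit,
            "sentences": sentences[i:i + sentences_per_unit],
        })
    return units

if __name__ == "__main__":
    sample = ("MapReduce splits input data. Each split is processed in "
              "parallel. Results are merged at the end.")
    for unit in split_into_units("digest-001", sample):
        print(unit)
```

Each unit produced this way could then serve as one input record for the map phase sketched earlier.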