2017
DOI: 10.1007/978-3-319-64203-1_28
Energy-Driven Straggler Mitigation in MapReduce

Abstract: Energy consumption is an important concern for large-scale data-centers and results in huge monetary costs for data-center operators. Due to hardware heterogeneity and contention between concurrent workloads, straggler mitigation matters to many Big Data applications running in large-scale data-centers, and speculative execution is the widely used technique for handling stragglers. Although a large number of studies have been proposed to improve the performance of Big Data applications using speculative…

Cited by 7 publications (8 citation statements)
References 16 publications
“…In certain cases, Precision is low: only 55% of the detected tasks are actual stragglers, and Recall is also relatively low at 56%. For the same case, the hierarchical approach [14], i.e., a green-driven straggler detection mechanism, achieves a Precision of 99% and a Recall of 29%. This increase in precision translates into lower execution time and energy consumption, and thus higher performance and energy efficiency; compared to the default Hadoop mechanism, execution time and energy consumption are reduced by almost 32% and 31%, respectively.…”
Section: Introduction
confidence: 90%
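For context, the Precision and Recall figures quoted in this statement follow the standard definitions over detected versus actual stragglers. The following minimal sketch (with hypothetical task IDs, not data from the paper) shows how such figures would be computed:

```python
def precision_recall(detected, actual):
    """Compute precision/recall of a straggler detector.

    detected: set of task IDs flagged as stragglers by the detector
    actual:   set of task IDs that truly straggled (ground truth)
    """
    true_positives = len(detected & actual)
    precision = true_positives / len(detected) if detected else 0.0
    recall = true_positives / len(actual) if actual else 0.0
    return precision, recall

# Hypothetical example: 11 flagged tasks, 6 of them real stragglers,
# out of 10 actual stragglers -> precision ~0.55, recall 0.6
detected = {f"t{i}" for i in range(11)}
actual = {f"t{i}" for i in range(6)} | {f"s{i}" for i in range(4)}
print(precision_recall(detected, actual))
```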
“…Throughout our experiments, we examined three straggler detection mechanisms, two of them from the literature: the Default [7] and LATE [26] mechanisms. The third mechanism we consider is Hierarchical [14], a green straggler detection scheme that is applied hierarchically on top of Default. Hereafter, we provide brief descriptions of the three mechanisms.…”
Section: Straggler Detection Mechanisms
confidence: 99%
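For readers unfamiliar with the mechanisms named above: Default speculates on tasks whose progress falls sufficiently behind the average, LATE speculates on the tasks estimated to finish farthest in the future, and Hierarchical [14] layers a green-driven filter on top of Default. The sketch below illustrates only the LATE-style remaining-time estimate; it follows the published LATE heuristic, but the cutoff fraction and helper names are illustrative, not code from either paper.

```python
def late_time_left(progress_score, elapsed_seconds):
    """LATE-style estimate of a running task's remaining time.

    progress_score: fraction of the task already completed, in (0, 1]
    elapsed_seconds: how long the task has been running
    """
    progress_rate = progress_score / elapsed_seconds   # progress per second
    return (1.0 - progress_score) / progress_rate      # estimated seconds left


def speculation_candidates(tasks, slow_fraction=0.25):
    """Rank slow tasks by estimated remaining time (longest first).

    tasks: dict of task_id -> (progress_score, elapsed_seconds)
    slow_fraction: illustrative cutoff for what counts as a slow progress rate
    """
    rates = {tid: p / t for tid, (p, t) in tasks.items()}
    cutoff_idx = min(int(len(rates) * slow_fraction), len(rates) - 1)
    cutoff = sorted(rates.values())[cutoff_idx]
    slow = [tid for tid, r in rates.items() if r <= cutoff]
    return sorted(slow, key=lambda tid: late_time_left(*tasks[tid]), reverse=True)


# Hypothetical snapshot of four map tasks (progress, seconds running):
tasks = {"m1": (0.9, 50), "m2": (0.8, 55), "m3": (0.2, 60), "m4": (0.85, 52)}
print(speculation_candidates(tasks))  # -> ['m3', 'm2']; m3 has by far the longest time left
```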
“…We find that the difference between the execution time of two copies in the same job can be more than 10x. Regarding the energy consumption, previous studies [14,15] have shown that there exists a trade-off between the performance and energy consumption when allocating speculative copies to nodes with different numbers of running tasks.…”
Section: Where To Launch? Heterogeneity Has To Be Considered
confidence: 99%
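One way to picture the trade-off mentioned in this statement: a lightly loaded node finishes the speculative copy sooner but may draw more power for it, while a busy or low-power node is cheaper energetically but slower. The sketch below is a hypothetical weighted scoring rule for choosing a node; the node model, the linear-slowdown assumption, and the weight are illustrative assumptions, not the allocation policies of [14, 15].

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    running_tasks: int     # tasks already running on the node
    base_runtime_s: float  # assumed copy runtime on an idle node
    power_watts: float     # assumed average power draw while running the copy

def pick_node(nodes, energy_weight=0.5):
    """Pick the node minimizing a normalized runtime/energy combination.

    Runtime is assumed to degrade linearly with co-running tasks; energy is
    runtime times average power. energy_weight in [0, 1] leans the decision
    toward energy (1.0) or performance (0.0).
    """
    runtime = {n.name: n.base_runtime_s * (1 + n.running_tasks) for n in nodes}
    energy = {n.name: runtime[n.name] * n.power_watts for n in nodes}
    max_r, max_e = max(runtime.values()), max(energy.values())

    def score(n):
        return ((1 - energy_weight) * runtime[n.name] / max_r
                + energy_weight * energy[n.name] / max_e)

    return min(nodes, key=score)

# Hypothetical cluster: an idle high-power node vs. a loaded low-power node.
nodes = [Node("idle-hot", running_tasks=0, base_runtime_s=60, power_watts=200),
         Node("busy-cool", running_tasks=3, base_runtime_s=60, power_watts=40)]
print(pick_node(nodes, energy_weight=0.2).name)  # performance-leaning -> idle-hot
print(pick_node(nodes, energy_weight=0.9).name)  # energy-leaning      -> busy-cool
```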
“…In conclusion, it is important to consider the impact of heterogeneity on performance and energy consumption when making speculative copy allocation decisions. However, as shown in [15], this might not be effective if it is done passively. Hence, this motivates our window-based reservation technique.…”
Section: Where To Launch? Heterogeneity Has To Be Considered
confidence: 99%