1996
DOI: 10.1155/1997/926796

Run‐Time and Compiler Support for Programming in Adaptive Parallel Environments

Abstract: For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have develope…
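The mechanism the abstract describes (recomputing each processor's loop bounds and moving data when the number of processors changes) can be illustrated with a minimal sketch. This is not the article's runtime; it assumes a plain 1-D block distribution, and the helper name block_bounds is invented for illustration.

```c
/*
 * Minimal sketch (not the article's runtime): recomputing the local loop
 * bounds of a 1-D block-distributed iteration space after the processor
 * count changes. The name block_bounds is illustrative only.
 */
#include <stdio.h>

/* Local range [lo, hi) owned by rank p out of nprocs for n global elements. */
static void block_bounds(int n, int nprocs, int p, int *lo, int *hi)
{
    int base = n / nprocs;      /* minimum elements per processor       */
    int rem  = n % nprocs;      /* first rem processors get one extra   */
    *lo = p * base + (p < rem ? p : rem);
    *hi = *lo + base + (p < rem ? 1 : 0);
}

int main(void)
{
    int n = 100;                /* global iteration space / array size  */

    /* Before adaptation: 4 processors; after adaptation: 6 processors. */
    for (int nprocs = 4; nprocs <= 6; nprocs += 2) {
        printf("nprocs = %d\n", nprocs);
        for (int p = 0; p < nprocs; p++) {
            int lo, hi;
            block_bounds(n, nprocs, p, &lo, &hi);
            /* Each processor would now execute iterations [lo, hi);    */
            /* the runtime would move the corresponding array blocks.   */
            printf("  rank %d owns [%d, %d)\n", p, lo, hi);
        }
    }
    return 0;
}
```

Comparing the old and new ownership ranges printed above is also what a redistribution step would use to derive its communication pattern: each processor sends the elements that leave its old range and receives the ones that enter its new range.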

Cited by 17 publications (8 citation statements)
References 3 publications

Citation statements (ordered by relevance):
“…Adaptive Thread Management [21] optimizes the number of resources allocated to data-parallel loops in automatically parallelized programs [20] during execution, based on an online comparison of the current speedup against expected values. The Adaptive Multiblock PARTI runtime library [15] follows a similar approach and provides good portability. Dynamic feedback [13] provides a compiler-assisted optimization scheme that adapts to different compiler-generated synchronization policies across alternating phases.…”
Section: Related Work (mentioning, confidence: 99%)
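A minimal sketch of the speedup-feedback idea this statement attributes to Adaptive Thread Management [21]: compare the measured speedup of a parallel loop against the expected (ideal) value and adjust the thread count for the next execution. The function adapt_thread_count and the 0.5/0.9 thresholds are assumptions for illustration, not the cited system's actual policy.

```c
#include <stdio.h>

/*
 * Hypothetical speedup-feedback adaptation: if the measured speedup on
 * `threads` processors falls well short of the expected (ideal) speedup,
 * shrink the thread count; if it is close to ideal, try to grow it.
 */
int adapt_thread_count(double serial_time, double parallel_time,
                       int threads, int max_threads)
{
    double measured = serial_time / parallel_time;   /* observed speedup */
    double expected = (double)threads;               /* ideal speedup    */

    if (measured < 0.5 * expected && threads > 1)
        return threads / 2;          /* poor efficiency: shrink          */
    if (measured > 0.9 * expected && threads < max_threads)
        return threads * 2;          /* near-ideal: try to grow          */
    return threads;                  /* otherwise keep the current count */
}

int main(void)
{
    /* Example: a loop that took 8.0 s serially runs in 5.0 s on 4 threads
       (speedup 1.6 vs. an expected 4.0), so the count is halved to 2.    */
    int next = adapt_thread_count(8.0, 5.0, 4, 8);
    printf("next thread count: %d\n", next);
    return 0;
}
```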
“…Adaptive task scheduling without parallelism feedback has also been studied empirically in the context of data-parallel languages [20,21]. This work focuses on compiler and runtime support for environments where the number of processors changes while the program executes.…”
Section: Related Work (mentioning, confidence: 99%)
“…With respect to them, ASSIST is able to target heterogeneous architectures at a higher level of abstraction. The proposed extensions of parallel programming languages (OpenMP [15], HPF [16]) are not flexible enough for a grid-like environment (e.g., they cannot acquire new PEs at run-time).…”
Section: Related Work (mentioning, confidence: 99%)