1998
DOI: 10.1109/4434.708258
Approaches for integrating task and data parallelism

Abstract: Languages that support both task and data parallelism are highly general and can exploit both forms of parallelism within a single application. However, integrating the two forms of parallelism cleanly and within a coherent programming model is difficult. This paper describes four languages (Fx, Opus, Orca, and Braid) that try to achieve such an integration and identifies several problems. The main problems are how to support both SPMD and MIMD style programs, how to organize the address space of a parallel progr…


Cited by 85 publications (55 citation statements)
References 11 publications
“…Parallel systems can be divided into task and data parallelism. Task parallelism refers to a system's ability to execute different tasks simultaneously, while data parallelism refers to executing separate instances of the same task over different blocks of data at the same time [8]. In a simplistic view, parallel processing has three main objectives: executing work faster, increasing the throughput of a program, or reducing the latency of bottleneck code.…”
Section: Parallel Processing
confidence: 99%
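The distinction quoted above can be sketched in a few lines of Python (a minimal illustration, not from the cited paper): task parallelism submits different functions to run concurrently, while data parallelism maps one function over separate blocks of data.

```python
from concurrent.futures import ThreadPoolExecutor

def word_count(text):
    # One kind of task: count words in a string.
    return len(text.split())

def char_count(text):
    # A different kind of task: count characters in a string.
    return len(text)

data = ["alpha beta", "gamma delta epsilon", "zeta"]

with ThreadPoolExecutor() as pool:
    # Task parallelism: two *different* tasks run simultaneously on the same input.
    t1 = pool.submit(word_count, data[0])
    t2 = pool.submit(char_count, data[0])
    task_results = (t1.result(), t2.result())

    # Data parallelism: the *same* task runs over different blocks of data.
    data_results = list(pool.map(word_count, data))

print(task_results)   # (2, 10)
print(data_results)   # [2, 3, 1]
```

The languages surveyed in the paper (Fx, Opus, Orca, Braid) aim to express both patterns within one programming model rather than through a library call as sketched here.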
“…Finally, one could envision extending our application model to address the situation in which each divisible load application consists of a set of tasks linked by dependencies. This would be an attractive extension of the mixed task and data parallelism approach [20,28,3,33,17] to heterogeneous clusters and grids.…”
Section: Results
confidence: 99%
“…Moreover, Section 3.4 will show Hinch efficiently exploits parallel architectures. The major difficulties we encounter are combining task- and data-parallelism [19], supporting dynamic reconfiguration, and handling asynchronous events.…”
Section: CPU + GPU Combinations
confidence: 99%
“…However, combining task- and data-parallelism is difficult [19]. Hinch needs to perform load balancing, synchronize parallel tasks, and allow communication between tasks that concurrently use different resources.…”
Section: Parallelism
confidence: 99%