2010 3rd Workshop on Many-Task Computing on Grids and Supercomputers (MTAGS 2010)
DOI: 10.1109/mtags.2010.5699430

Improving Many-Task computing in scientific workflows using P2P techniques

Abstract: Large-scale scientific experiments are usually supported by scientific workflows that may demand a high performance computing infrastructure. Within a given experiment, the same workflow may be explored with different sets of parameters. However, parallelizing workflow instances is hard to accomplish, mainly due to the heterogeneity of their activities. The Many-Task Computing paradigm seems to be a candidate approach to support workflow activity parallelism. However, scheduling a huge amount of workf…

Cited by 3 publications (2 citation statements)
References 33 publications (30 reference statements)

“…Dias et al. [12] propose Heracles, an approach that uses Peer-to-Peer (P2P) techniques to manage distributed scientific workflows on large clusters based on QoS criteria such as deadlines. However, unlike the approach proposed in this paper, Heracles focuses on clusters and does not address cloud configuration issues.…”
Section: Related Work and Background (mentioning)
Confidence: 99%
“…Volunteer computing (VC) tries to take advantage of idle resources by 'donating' them to a given project; that is, the project coordinator transmits (usually over the Internet) an activity to another, donor computer, which processes it. This paradigm can provide a great deal of processing power (ANDERSON, 2004), but for scientific experiments many factors can make this solution inefficient, for example long-running processing (DETHIER et al., 2008), large data volumes (DUAN et al., 2012), system heterogeneity (DORNEMANN et al., 2012), and the instability of volunteer computers (DIAS et al., 2010).…”
Section: Lista de Tabelas (unclassified)