2015
DOI: 10.14419/jacst.v4i1.4009

A four-phase data replication algorithm for data grid

Abstract: Nowadays, scientific applications generate huge amounts of data, on the order of terabytes or petabytes. Data grids currently offer solutions to large-scale data management problems, including efficient file transfer and replication. Data is typically replicated in a Data Grid to improve job response time and data availability. Determining a reasonable number of replicas and the right locations for them has become a challenge in the Data Grid. In this paper, a four-phase dynamic data replication algorithm based on Temporal and Geographical …

Cited by 3 publications (1 citation statement)
References 14 publications
“…Therefore, the chances that the same data center will be accessed in the near future for MPF i downloads are high. The authors of [24] and [44] stressed the importance of considering geographical locality in a replication environment: when a file has recently been accessed at a particular storage node, nearby data centers have a high probability of re-accessing it. The researchers agreed that placing replica copies only in the data center with a high access frequency for a specific file (the popular candidate file) is ineffective; instead, it is preferable to place popular files in the popular data center.…”
Section: A. Accumulation Of User Merit (μ)
confidence: 99%
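
The placement heuristic described in this citation context, favoring popular data centers over data centers that merely accessed a specific file often, can be sketched as follows. This is an illustrative Python sketch only, not the paper's four-phase algorithm; the names AccessLog and pick_replica_site are hypothetical.

# Illustrative sketch (not the paper's algorithm): rank data centers by overall
# popularity, breaking ties by accesses to the candidate file, as the cited
# discussion of geographical locality suggests.
from collections import Counter
from typing import Iterable, Tuple

# Each log entry: (data_center_id, file_id)
AccessLog = Iterable[Tuple[str, str]]

def pick_replica_site(log: AccessLog, candidate_file: str) -> str:
    """Return the data center where a replica of the popular file is placed.

    Heuristic: prefer the data center with the highest total access frequency
    (the "popular data center"), using the candidate file's own access count
    there only as a tie-breaker.
    """
    total = Counter()       # overall accesses per data center
    per_file = Counter()    # accesses to the candidate file per data center
    for center, file_id in log:
        total[center] += 1
        if file_id == candidate_file:
            per_file[center] += 1
    return max(total, key=lambda c: (total[c], per_file[c]))

if __name__ == "__main__":
    log = [("dc1", "f1"), ("dc1", "f2"), ("dc1", "f3"),
           ("dc2", "f1"), ("dc2", "f1")]
    # dc2 requested f1 more often, but dc1 is the more popular data center.
    print(pick_replica_site(log, "f1"))  # -> "dc1"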