2008
DOI: 10.1002/cpe.1298

Toward replication in grids for digital libraries with freshness and correctness guarantees

Abstract: Building digital libraries (DLs) on top of data grids while facilitating data access and minimizing access overheads is challenging. To achieve this, replication in a Grid has to provide dedicated features that are only partly supported by existing Grid environments. First, it must provide transparent and consistent access to distributed data. Second, it must dynamically control the creation and maintenance of replicas. Third, it should allow higher replication granularities, i.e. beyond individual file…
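The three requirements named in the abstract can be pictured with a small sketch; every class and method name below is hypothetical and is not taken from the paper or from any existing Grid middleware.

```python
# Illustrative sketch only: a hypothetical replica-manager interface mirroring
# the three requirements from the abstract (transparent access, dynamic replica
# maintenance, and replication granularity above the file level). None of these
# names come from the paper.
from dataclasses import dataclass, field


@dataclass
class Replica:
    site: str          # grid site hosting this copy
    version: int = 0   # last update applied at this copy


@dataclass
class Collection:
    """Replication unit coarser than a single file, e.g. a DL collection."""
    name: str
    files: list[str] = field(default_factory=list)
    replicas: list[Replica] = field(default_factory=list)


class ReplicaManager:
    def read(self, collection: Collection) -> Replica:
        # Transparent, consistent access: route the request to the freshest replica.
        return max(collection.replicas, key=lambda r: r.version)

    def maintain(self, collection: Collection, master_version: int) -> None:
        # Dynamic maintenance: bring lagging replicas up to the master's version.
        for replica in collection.replicas:
            if replica.version < master_version:
                replica.version = master_version  # stands in for an actual refresh
```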

Cited by 11 publications (17 citation statements)
References 17 publications
“…Commonly, the freshness of a replicated data set D′ is defined as the difference between D′ and the original data set D. Various freshness metrics can be used, such as the version difference or the amount of modified data [5], [12] and the time difference [1], [16]. High freshness means that replicated data are close to the original data, and thus the replication service is reliable to a certain degree.…”
Section: Freshness (mentioning)
confidence: 99%
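The metrics named in this excerpt can be made concrete with a minimal sketch; the data model and function names below are assumptions for illustration and are not taken from the cited papers.

```python
# A minimal sketch of the freshness metrics mentioned above (version difference,
# amount of modified data, time difference). All names are illustrative.
from dataclasses import dataclass


@dataclass
class DataSetState:
    version: int           # monotonically increasing update counter
    modified_bytes: int    # cumulative amount of modified data, in bytes
    last_update_ts: float  # wall-clock time of the last applied update


def version_difference(replica: DataSetState, original: DataSetState) -> int:
    """Number of updates the replica is missing (0 means perfectly fresh)."""
    return max(original.version - replica.version, 0)


def modified_data_difference(replica: DataSetState, original: DataSetState) -> int:
    """Amount of modified data not yet propagated to the replica, in bytes."""
    return max(original.modified_bytes - replica.modified_bytes, 0)


def time_difference(replica: DataSetState, original: DataSetState) -> float:
    """Time lag between the replica and the original, in seconds."""
    return max(original.last_update_ts - replica.last_update_ts, 0.0)
```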
“…Full database replication has been heavily studied [2,5,21,6,19,1,9]. The traditional correctness criterion for data replication is one-copy-serializability (1CS) [4].…”
Section: Related Work (mentioning)
confidence: 99%
“…For instance, let us assume that there are three sites {s_1, s_2, s_3}, where s_1 and s_2 use their entire processing capacity for local work (R_i = 0, L_i = C at both sites). The amount of remote work at s_3 is R_3 = (C + C) * 0.75 = 1.5C.…”
Section: Analytical Model (mentioning)
confidence: 99%
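The arithmetic in this example can be checked with a short sketch, under its stated assumptions: equal per-site capacity C, sites s_1 and s_2 devoted entirely to local work, and a fraction of 0.75 of their combined local work replayed remotely at s_3. The helper function is purely illustrative; only the numbers come from the excerpt.

```python
# Numeric check of R_3 = (C + C) * 0.75 = 1.5C from the excerpt above.
def remote_work(local_loads: list[float], replay_factor: float) -> float:
    """Remote (refresh) work induced at one site by the other sites' local work."""
    return sum(local_loads) * replay_factor


C = 1.0  # normalize the per-site processing capacity
R_3 = remote_work([C, C], replay_factor=0.75)
print(R_3)  # -> 1.5, i.e. R_3 = 1.5 * C
```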
“…Many solutions have been proposed in distributed systems for managing replicas [13], [11], [9], [8], [5] and [12]. Some of them include freshness control [15], [7], [10] and [1]. We base our work on the Leg@net approach [7], since it offers update-anywhere and freshness-control features and does not require any modification of either the underlying DBMS or the application source code.…”
Section: Introduction (mentioning)
confidence: 99%