1998
DOI: 10.1145/279310.279321

Delta algorithms

Abstract: Delta algorithms compress data by encoding one file in terms of another. This type of compression is useful in a number of situations: storing multiple versions of data, displaying differences, merging changes, distributing updates, storing backups, transmitting video sequences, and others. This article studies the performance parameters of several delta algorithms, using a benchmark of over 1,300 pairs of files taken from two successive releases of GNU software. Results indicate that modern delta compression …
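To make the idea concrete, the following is a minimal sketch of delta encoding under stated assumptions; it is not one of the algorithms benchmarked in the article. The new file is expressed as COPY instructions that point back into the old file plus ADD instructions carrying literal bytes; the fixed 16-byte block index and the greedy match extension are illustrative choices only.

BLOCK = 16  # block size chosen arbitrarily for this illustration

def make_delta(old: bytes, new: bytes):
    # Index every BLOCK-aligned block of the old file by its contents.
    index = {}
    for off in range(0, len(old) - BLOCK + 1, BLOCK):
        index.setdefault(old[off:off + BLOCK], off)

    delta, literal, i = [], bytearray(), 0
    while i < len(new):
        off = index.get(new[i:i + BLOCK])
        if off is not None:
            if literal:
                delta.append(("ADD", bytes(literal)))
                literal = bytearray()
            # Extend the match past the block boundary as far as it holds.
            n = BLOCK
            while off + n < len(old) and i + n < len(new) and old[off + n] == new[i + n]:
                n += 1
            delta.append(("COPY", off, n))
            i += n
        else:
            literal.append(new[i])
            i += 1
    if literal:
        delta.append(("ADD", bytes(literal)))
    return delta

def apply_delta(old: bytes, delta) -> bytes:
    # Reconstruct the new file from the old file and the delta.
    out = bytearray()
    for op in delta:
        if op[0] == "COPY":
            out += old[op[1]:op[1] + op[2]]
        else:
            out += op[1]
    return bytes(out)

A round trip such as apply_delta(old, make_delta(old, new)) == new holds for any pair of byte strings, and the delta stays small whenever the two versions share long runs, which is the situation the article's GNU-release benchmark exercises.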

Cited by 115 publications (45 citation statements) · References 12 publications
“…Delta encoding [Ajtai et al 2000; Hunt et al 1998; Tichy 1984] is a technique that attempts to encode the difference between two given strings (or objects) in the most efficient way possible. This technique is used extensively in versioning systems such as CVS [Cederqvist 1992], SCCS [Rochkind 1975] and RCS [Tichy 1985].…”
Section: Background and Related Work
confidence: 99%
“…We now compare basic rsync with a version using zdelta and with a version using both zdelta and shorter hash values for v W Y 3. For the experiments, we used the gcc and emacs data sets also used in [20], [15], as this results in many additional matches that are not caught by the larger block sizes. In general, the results show that there is no one optimal block size, and that the choice depends heavily on the data sets.…”
Section: Some Basic Optimizations and Their Performance
confidence: 99%
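The block-size trade-off quoted above is easiest to see in the weak rolling checksum that rsync-style protocols rely on. The sketch below is an illustration under assumed parameters (an Adler-style checksum with an arbitrary modulus and block size), not rsync's or zdelta's actual implementation: the receiver hashes its copy in fixed-size blocks, while the sender slides a window one byte at a time over its own file so matching blocks can be found at any offset; smaller blocks catch more matches at the cost of more per-block metadata.

M = 1 << 16  # modulus for this illustration; real implementations differ

def weak_sum(window: bytes):
    # Checksum of one block: a is the plain byte sum, b weights earlier bytes more.
    a = sum(window) % M
    b = sum((len(window) - k) * byte for k, byte in enumerate(window)) % M
    return a, b

def roll(a, b, out_byte, in_byte, block):
    # Slide the window one byte to the right: drop out_byte, append in_byte.
    a = (a - out_byte + in_byte) % M
    b = (b - block * out_byte + a) % M
    return a, b

# Sanity check: rolling the window agrees with recomputing from scratch.
data = bytes(range(64))
block = 8
a, b = weak_sum(data[:block])
for i in range(1, len(data) - block + 1):
    a, b = roll(a, b, data[i - 1], data[i - 1 + block], block)
    assert (a, b) == weak_sum(data[i:i + block])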
“…Our adaptation of PSDFs to compute the similarity of dynamic program slices employs an analogous strategy to delta change algorithms (Hunt et al 1998). Our current approach is not optimized and employs a Θ(n²) algorithm to compute the similarity between n dynamic program slices and the initial prediction slice.…”
Section: Case Study: Dunham Model
confidence: 99%
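For context on the Θ(n²) cost quoted above, here is a hedged sketch of an all-pairs comparison over n slices; representing a slice as a set of statement ids and scoring pairs with Jaccard similarity are illustrative assumptions, not the cited paper's actual metric.

from itertools import combinations

def jaccard(a: set, b: set) -> float:
    # Similarity of two slices represented as sets of statement ids.
    return len(a & b) / len(a | b) if (a or b) else 1.0

def pairwise_similarity(slices):
    # Every unordered pair is scored once: n*(n-1)/2 comparisons, i.e. Θ(n²).
    return {(i, j): jaccard(slices[i], slices[j])
            for i, j in combinations(range(len(slices)), 2)}

# Example: three toy slices yield 3 pairwise scores.
sims = pairwise_similarity([{1, 2, 3}, {2, 3, 4}, {9}])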