2016
DOI: 10.1177/1094342015621367
An efficient parallelization of longest prefix match and application on data compression

Abstract: In this article, we describe a new approach to parallelizing the longest prefix match (LPM) algorithm through bit parallelism, also known as the bit-vector approach, which makes use of bit-wise computations. The proposed parallel algorithm is demonstrated in dictionary-based lossless data compression on general-purpose graphics processing units (GPGPUs). One of the main contributions of this work is redesigning the core part of the data compression algorithm and replacing it wi…
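To make the bit-vector idea concrete, the sketch below shows a serial Shift-And style longest-prefix-match search as used in LZ77-style dictionary compression: the lookahead buffer is encoded as per-character bit masks, and one bit-wise recurrence reports, for every dictionary position, how long a prefix of the lookahead matches there. This is only a minimal illustrative sketch, not the paper's GPU algorithm; the function name bit_vector_lpm, the 64-symbol lookahead limit, and the LZ77 framing are assumptions made here for illustration.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch (not the authors' GPU algorithm): bit-vector
 * (Shift-And style) longest prefix match for LZ77-style compression.
 * The lookahead is capped at 64 symbols so the match state fits in
 * one 64-bit word. */
typedef struct {
    int length;  /* length of the longest match found (0 if none) */
    int offset;  /* start position of that match inside the dictionary */
} lpm_result;

static lpm_result bit_vector_lpm(const unsigned char *dict, int dict_len,
                                 const unsigned char *lookahead, int la_len)
{
    if (la_len > 64) la_len = 64;          /* one machine word of state */

    /* B[c]: bit j is set iff lookahead[j] == c. */
    uint64_t B[256] = {0};
    for (int j = 0; j < la_len; ++j)
        B[lookahead[j]] |= (uint64_t)1 << j;

    lpm_result best = {0, -1};
    uint64_t D = 0;                        /* match-state bit vector */

    for (int p = 0; p < dict_len; ++p) {
        /* After this update, bit j of D is set iff the first j+1 symbols
         * of the lookahead match dict[p-j .. p]. */
        D = ((D << 1) | 1) & B[dict[p]];

        if (D) {
            int j = 63 - __builtin_clzll(D);   /* highest set bit */
            if (j + 1 > best.length) {
                best.length = j + 1;
                best.offset = p - j;
            }
        }
    }
    return best;
}

int main(void)
{
    const char *dict = "abracadabra";
    const char *look = "abrab";
    lpm_result r = bit_vector_lpm((const unsigned char *)dict, (int)strlen(dict),
                                  (const unsigned char *)look, (int)strlen(look));
    /* Prints: longest match: length 4 at dictionary offset 0 ("abra") */
    printf("longest match: length %d at dictionary offset %d\n", r.length, r.offset);
    return 0;
}
```

Because the entire match state lives in one machine word, each dictionary symbol is processed with a handful of bit operations; this is the kind of bit-wise computation the abstract refers to when it says the approach leverages bit parallelism.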

Cited by 3 publications (1 citation statement)
References 36 publications
“…Vector data tiling can be applied to reduce the data extent so that values can be converted to even smaller byte integers, with which a data rate saving of 80% can be achieved [20]. Lossless compression algorithms are mainly based on information theory, such as Huffman encoding and LZ-series encoding [21,22], which rely on data redundancy evaluation and dictionary building. Algorithms of this kind achieve error-free decoding and are mostly used for file compression and data archiving, and can be applied to the compression of vector files or internet transmission.…”
Section: Introduction (mentioning; confidence: 99%)