2008 International Conference on Multimedia and Ubiquitous Engineering (MUE 2008)
DOI: 10.1109/mue.2008.49
Recovery of Damaged Compressed Files for Digital Forensic Purposes

Cited by 11 publications (12 citation statements)
References 3 publications
“…5 below. In the first phase, deflate-encoded data in file fragments are detected using the approach in [1,14]. A deflate-encoded data portion is formed by a sequence of compressed blocks; each block includes a header, a Huffman table, and compressed data.…”
Section: Proposed Methods
confidence: 99%
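The block structure described above (header, Huffman table, compressed data) can be illustrated with a small sketch. Per RFC 1951, each deflate block starts with a 3-bit header packed LSB-first: BFINAL (last-block flag) and BTYPE (compression mode). The function and variable names below are my own, not from the cited work.

```python
import zlib

def deflate_block_header(data: bytes, bit_offset: int = 0):
    """Read the 3-bit header of a raw-deflate block starting at bit_offset.

    RFC 1951 packs bits LSB-first: bit 0 is BFINAL (1 = last block),
    bits 1-2 are BTYPE (0 = stored, 1 = fixed Huffman,
    2 = dynamic Huffman, 3 = invalid).
    """
    byte, bit = divmod(bit_offset, 8)
    # Take two bytes so a header straddling a byte boundary still fits.
    window = int.from_bytes(data[byte:byte + 2].ljust(2, b"\x00"), "little")
    bfinal = (window >> bit) & 0b1
    btype = (window >> (bit + 1)) & 0b11
    return bfinal, btype

# Produce a raw deflate stream (wbits=-15: no zlib/gzip wrapper) to inspect.
co = zlib.compressobj(9, zlib.DEFLATED, -15)
stream = co.compress(b"forensic recovery " * 64) + co.flush()
bfinal, btype = deflate_block_header(stream)
```

A BTYPE of 3 (invalid) at a candidate bit position is an immediate way to rule that position out when scanning fragments.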
“…Therefore, they developed a tool called zsniff, which tries to detect a deflate header and decompress the compressed data in a file fragment. This method is quite similar to the bit-by-bit process in [14]. Moreover, to improve the chance of identifying at least one deflate-encoded block, a data fragment size of 18 KiB was recommended.…”
Section: Related Work
confidence: 99%
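The detect-and-decompress idea attributed to zsniff can be sketched by attempting raw-deflate decompression at successive offsets of a fragment and keeping offsets that decode cleanly. This is a byte-granular simplification of the bit-by-bit scanning described in the citation; the function and parameter names are illustrative, not zsniff's actual interface.

```python
import zlib

def find_deflate_candidates(fragment: bytes, min_output: int = 64):
    """Try raw-deflate decompression at every byte offset of a fragment.

    Offsets that decode at least min_output bytes without a zlib error are
    reported as candidate deflate streams (a simplification: the cited
    approach scans bit by bit, not byte by byte).
    """
    candidates = []
    for offset in range(len(fragment)):
        d = zlib.decompressobj(-15)  # raw deflate, no zlib/gzip header
        try:
            out = d.decompress(fragment[offset:])
        except zlib.error:
            continue  # no valid stream starts at this offset
        if len(out) >= min_output:
            candidates.append((offset, len(out)))
    return candidates

# Bury a raw deflate stream behind 16 junk bytes and locate it.
co = zlib.compressobj(9, zlib.DEFLATED, -15)
buried = b"\x00" * 16 + co.compress(b"evidence " * 80) + co.flush()
hits = find_deflate_candidates(buried)
```

The `min_output` threshold filters out offsets that happen to decode a few garbage bytes before failing, at the cost of missing very short streams.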
“…However, many security-related applications, such as ransomware detection, traffic analysis and digital forensics, generally do not have access to whole-file information, but rather work at the level of fragments of data. In these settings, the metadata that is required by parsers is not present or is incomplete [35]. Given this issue, a number of works have been looking at alternative tests to distinguish between encrypted and compressed content [12,14,23,30,32,34,41].…”
Section: Introduction
confidence: 99%
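The difficulty of telling encrypted from compressed fragments, which motivates the alternative tests cited above, is easy to demonstrate with Shannon entropy: deflate output approaches the 8 bits/byte ceiling that ciphertext also sits at, so entropy alone does not separate the two. This is a generic illustration, not a method from any of the cited works.

```python
import math
import zlib

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

text = b"the quick brown fox jumps over the lazy dog " * 100
compressed = zlib.compress(text, 9)
# Plain text sits well below 8 bits/byte; deflate output climbs toward it,
# into the same range as encrypted data.
```

Because both compressed and encrypted fragments score near the ceiling, the works cited above turn to finer statistical tests and format-aware checks instead.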
“…Lakhani [14] adjusted the construction rules of the encoder and dictionary table and verified the data by adding redundant bits. Park et al. [15] used compression coding rules to detect corrupted ZIP compressed data. However, to keep error correction fast, exhaustive error-correction methods can correct only one bit at a time.…”
Section: Introduction
confidence: 99%
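The exhaustive one-bit-at-a-time correction mentioned above can be sketched as a brute-force search: flip each bit of a damaged stream in turn and accept the first candidate that decompresses to the expected length. This is a minimal illustration of the general idea under my own assumptions (raw deflate, known output length), not the actual procedure of [15].

```python
import zlib

def correct_single_bit_error(corrupt: bytes, expected_len: int):
    """Exhaustive single-bit correction of a raw-deflate stream (sketch).

    Flip each bit in turn and accept the first candidate that decompresses
    to expected_len bytes; this mirrors the 1-bit-at-a-time exhaustive
    search the citation describes, which is why it stays fast only for
    single-bit errors.
    """
    for i in range(len(corrupt) * 8):
        candidate = bytearray(corrupt)
        candidate[i // 8] ^= 1 << (i % 8)  # flip bit i
        try:
            out = zlib.decompress(bytes(candidate), -15)
        except zlib.error:
            continue  # this flip does not yield a valid stream
        if len(out) == expected_len:
            return out
    return None

payload = b"digital forensics " * 40
co = zlib.compressobj(9, zlib.DEFLATED, -15)
stream = bytearray(co.compress(payload) + co.flush())
stream[len(stream) // 2] ^= 0x08          # inject a single-bit error
recovered = correct_single_bit_error(bytes(stream), len(payload))
```

The cost grows linearly in stream length for one bit but combinatorially for multi-bit errors, which is the speed limitation the citation points out.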