10th International Conference on Availability, Reliability and Security (ARES 2015)
DOI: 10.1109/ares.2015.32

Gradually Improving the Forensic Process

Abstract: At the time of writing, one of the most pressing problems for forensic investigators is the huge amount of data to analyze per case. Not only is the number of devices increasing due to the advancing computerization of everyday life, but the storage capacity of each device is also growing, leading to multi-terabyte storage requirements per case for forensic working images. In this paper we improve the standardized forensic process by proposing the use of file deduplication across devices as well as file whitelisting rig…
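To make the proposed reduction concrete, the following is a minimal sketch, assuming a straightforward hash-based implementation (the paper's actual pipeline may differ): each file is streamed through SHA-256, files whose digest appears on a whitelist of known-benign hashes are skipped, and duplicate content across devices is kept only once. All paths and the whitelist contents are hypothetical placeholders.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        # Stream the file in 1 MiB chunks so large images fit in memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def deduplicate(device_roots, whitelist):
        # Map digest -> first path seen; whitelisted (known-benign) files
        # are dropped, and later copies of the same content are ignored.
        unique = {}
        for root in device_roots:
            for path in Path(root).rglob("*"):
                if not path.is_file():
                    continue
                digest = sha256_of(path)
                if digest in whitelist:
                    continue
                unique.setdefault(digest, path)
        return unique

Only the files returned by deduplicate() would then need to be copied into the working image and examined, which is where the storage and analysis-time savings come from.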

Cited by 5 publications (4 citation statements)
References 28 publications
“…In this section, we first describe the theoretical approach as described in the original paper by Neuner et al. This first part describes an artificial scenario.…”
Section: Discussion
confidence: 99%
“…This paper is the extended version of the paper by Neuner et al. and was funded by COMET K1, FFG ‐ Austrian Research Promotion Agency, and by FFG grant 846070: SpeedFor. Additionally, we want to thank the employees of SBA Research for giving us access to their data and processing power.…”
Section: Acknowledgements
confidence: 99%
“…This technique can work on top of the DFaaS framework. Employing data deduplication techniques effectively reduces the data that must be processed during the investigation, and blacklisting potentially enables faster detection of illegal file artefacts [21,25]. Experiments in [21] and [8] have demonstrated significant savings in storage and in the volume of data processed.…”
Section: DFaaS and Data Deduplication
confidence: 99%
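The blacklisting idea mentioned in the statement above admits an equally small sketch: given a set of known-illegal digests, detection becomes a constant-time set lookup per file. This is an illustrative assumption, not the cited papers' implementation; the blacklist entry and paths are placeholders.

    import hashlib
    from pathlib import Path

    # Placeholder blacklist; real entries would come from a reference
    # database of known-illegal material.
    KNOWN_ILLEGAL = {"0" * 64}

    def flag_illegal(root):
        # Yield every file under root whose SHA-256 digest is blacklisted.
        # read_bytes() loads the whole file; fine for a sketch, streaming
        # would be preferable for large evidence files.
        for path in Path(root).rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                if digest in KNOWN_ILLEGAL:
                    yield path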
“…Based on the DFaaS paradigm, data deduplication was proposed to reduce the time spent repeatedly acquiring and analysing known file artefacts [25]. Several experiments have been performed that prove the increased efficiency of a deduplication system [8,21,28]. Creating a centralised artefact whitelist eliminates known, benign operating system and application files, as well as known, benign user-created files.…”
Section: Introduction
confidence: 99%