2013
DOI: 10.1007/978-3-642-36883-7_14

Data Leak Detection as a Service

Abstract: We describe a network-based data-leak detection (DLD) technique, the main feature of which is that the detection does not reveal the content of the sensitive data. Instead, only a small amount of specialized digests is needed. Our technique, referred to as fuzzy fingerprint detection, can be used to detect accidental data leaks due to human errors or application flaws. The privacy-preserving feature of our algorithms minimizes the exposure of sensitive data and enables the data owner to safely delegate th…
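As a rough illustration of the digest-based matching described in the abstract, the sketch below assumes a simplified workflow: the data owner releases only coarsened ("fuzzy") digests of sensitive n-grams, the DLD provider flags traffic positions whose coarsened digests collide with that set, and the owner post-filters the candidates with exact digests. The shingle length, the number of hidden bits, and the use of truncated SHA-256 in place of Rabin-style fingerprints are illustrative assumptions, not the exact construction from the paper.

```python
import hashlib

NGRAM = 8          # shingle length in bytes (illustrative choice)
FUZZ_BITS = 12     # low-order digest bits hidden from the provider (assumption)

def shingles(data: bytes, n: int = NGRAM):
    """Yield all contiguous n-byte shingles of the data."""
    return (data[i:i + n] for i in range(len(data) - n + 1))

def digest(chunk: bytes) -> int:
    # Truncated SHA-256 stands in for the Rabin-style fingerprints
    # used in this line of work.
    return int.from_bytes(hashlib.sha256(chunk).digest()[:8], "big")

def fuzzify(d: int, bits: int = FUZZ_BITS) -> int:
    # Coarsen the digest so the provider cannot recover exact content.
    return d >> bits

def owner_prepare(sensitive: bytes) -> set[int]:
    """Data owner: precompute fuzzy digests of the sensitive data."""
    return {fuzzify(digest(s)) for s in shingles(sensitive)}

def provider_scan(traffic: bytes, fuzzy_set: set[int]) -> list[int]:
    """DLD provider: flag traffic offsets whose fuzzy digest collides."""
    return [i for i, s in enumerate(shingles(traffic))
            if fuzzify(digest(s)) in fuzzy_set]

def owner_confirm(traffic: bytes, candidates: list[int],
                  sensitive: bytes) -> list[int]:
    """Data owner: drop the false positives that fuzzification introduces."""
    exact = {digest(s) for s in shingles(sensitive)}
    grams = list(shingles(traffic))
    return [i for i in candidates if digest(grams[i]) in exact]
```

In this sketch only the coarsened digest set ever leaves the data owner, so the provider can scan outbound traffic without learning the sensitive content itself.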

Cited by 45 publications (34 citation statements)
References 27 publications
“…In [7], a network-based data-leak detection (DLD) technique is presented; its key feature is that detection does not require the information proprietor to reveal the content of the sensitive data. Instead, only a small amount of specialized digests is needed.…”
Section: Related Work
mentioning
confidence: 99%
“…However, organizing data across hundreds or thousands of servers becomes considerably more challenging from a performance standpoint. Substantial quantities of data must be transported to each newly created VM [1]. Data distribution methods should focus on reducing the overall transmission time, lowering the total burden of data relocation and mitigating its impact on the hosted services and the time their requests take [2].…”
Section: Related Work
mentioning
confidence: 99%
“…The work by Papadimitriou and Garcia-Molina [26] aims at finding the agents that leaked the sensitive data. Shu and Yao [32] presented privacy-preserving methods for protecting sensitive data in a non-MapReduce based detection environment. Shu et al [33] further proposed to accelerate screening transformed data leaks using GPU.…”
Section: Related Work
mentioning
confidence: 99%
“…In the last category, the analysis proposed by Borders and Prakash [6] detects changes in network traffic patterns by searching for an unjustifiable increase in HTTP traffic-flow volume that indicates data exfiltration. The technique proposed by Shu and Yao [32] performs deep packet inspection to search for exposed outbound traffic that bears high similarity to sensitive data. Set intersection is used for the similarity measure.…”
mentioning
confidence: 99%
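The last citation statement above describes similarity between outbound traffic and sensitive data computed by set intersection over content fingerprints. The sketch below illustrates one plausible form of such a measure; the n-gram size, the truncated SHA-256 digest, and the normalization by the sensitive set are illustrative assumptions rather than the exact choices in the cited work.

```python
import hashlib

def ngram_digests(data: bytes, n: int = 8) -> set[int]:
    """Summarize data as a set of digests over its n-byte shingles."""
    return {int.from_bytes(hashlib.sha256(data[i:i + n]).digest()[:8], "big")
            for i in range(len(data) - n + 1)}

def leak_similarity(traffic: bytes, sensitive: bytes, n: int = 8) -> float:
    """Set-intersection similarity between a traffic payload and sensitive data."""
    t, s = ngram_digests(traffic, n), ngram_digests(sensitive, n)
    if not s:
        return 0.0
    # Normalize by the sensitive set so a fully leaked record scores 1.0
    # even when it is embedded in a much larger payload.
    return len(t & s) / len(s)

if __name__ == "__main__":
    secret = b"SSN 078-05-1120, account 4111-1111-1111-1111"
    packet = b"POST /upload HTTP/1.1\r\n\r\n" + secret
    print(f"similarity = {leak_similarity(packet, secret):.2f}")  # 1.00
```

A payload containing the whole sensitive record scores 1.0, while unrelated traffic shares few or no shingle digests and scores near 0.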