2005
DOI: 10.1287/isre.1050.0056

Maximizing Accuracy of Shared Databases when Concealing Sensitive Patterns

Abstract: The sharing of databases either within or across organizations raises the possibility of unintentionally revealing sensitive relationships contained in them. Recent advances in data-mining technology have increased the chances of such disclosure. Consequently, firms that share their databases might choose to hide these sensitive relationships prior to sharing. Ideally, the approach used to hide relationships should be impervious to as many data-mining techniques as possible, while minimizing the resulting dist…

Cited by 99 publications (68 citation statements)
References 15 publications
“…Systems enabling a flow control for accountable information would have to be designed, implemented, tested, and standardized along with the underlying data models, audit processes, and controls. Quality metrics could adapt existing paradigms for privacy metrics, such as k-anonymity for database privacy (Sweeney 2002), effectiveness of data perturbation approaches (Menon et al. 2005; Li and Sarkar 2006), information-theoretic metrics of data leakage, or randomization-based proof techniques inspired by differential privacy (Dwork 2011).…”
Section: Technical Challenges
confidence: 99%
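The k-anonymity paradigm mentioned in the statement above can be illustrated with a minimal sketch: a table is k-anonymous if every combination of quasi-identifier values is shared by at least k records. The field names and records below are hypothetical, chosen only for illustration.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every quasi-identifier combination occurs >= k times."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(c >= k for c in counts.values())

# Toy table with generalized (starred) values, as is common after anonymization.
rows = [
    {"zip": "47677", "age": "2*"},
    {"zip": "47677", "age": "2*"},
    {"zip": "47602", "age": "3*"},
]
print(is_k_anonymous(rows, ["zip", "age"], 2))  # False: the (47602, 3*) group has size 1
```

The check is deliberately simple; real anonymizers must also decide how to generalize or suppress values to reach a target k with minimal information loss.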
“…The Full-Exact method is performed together with a popular existing algorithm chosen from the literature for comparison purposes. The reference method is Menon's exact approach (Menon et al., 2005). The first part of the method proposed in Menon et al. (2005) decides the minimum number of transactions that have to be sanitized, with the objective of maximizing accuracy, using linear programming. The results are then used by the heuristic part for the actual sanitization, trying to do minimum harm to the actual database.…”
Section: Performance Evaluation
confidence: 99%
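The two-phase idea described in the statement above can be sketched in miniature. This is not Menon et al.'s actual formulation (which solves an optimization model); for a single sensitive itemset, the "exact" phase reduces to computing how many supporting transactions must be altered, and a heuristic phase then picks which transactions to alter, here preferring the shortest ones as a crude proxy for minimizing side effects on other patterns.

```python
def sanitize(db, sensitive, max_support):
    """Remove one item of `sensitive` from supporting transactions until
    the itemset's support falls to `max_support`. Shortest transactions
    are sanitized first, a simple heuristic to limit collateral damage."""
    sensitive = set(sensitive)
    supporting = [t for t in db if sensitive <= t]
    excess = max(0, len(supporting) - max_support)  # how many must be altered
    for t in sorted(supporting, key=len)[:excess]:
        t.discard(next(iter(sensitive)))  # drop one item of the sensitive set
    return db

db = [{"a", "b", "c"}, {"a", "b"}, {"a", "b", "d"}, {"c", "d"}]
db = sanitize(db, {"a", "b"}, 1)
# support of {a, b} is now 1
```

A real sanitizer would choose both the transactions and the dropped items to minimize distortion of non-sensitive frequent itemsets, which is exactly the accuracy-maximization trade-off the cited paper formalizes.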
“…The user often needs to choose a preferred one by using high-level domain experience. Generally, the receiver of shared data would likely be interested in a sharing agreement only when the data accuracy is reasonable [5]. In other words, the degree of data distortion cannot exceed a maximum level.…”
Section: Performance Evaluation
confidence: 99%
“…A variety of other solutions have been proposed [5], [6]. Generally, the difficulty of data sanitization is not in concealing sensitive knowledge, but in reducing the accompanying side effects.…”
Section: Introduction
confidence: 99%