Proceedings of the 1st International Workshop on AI for Privacy and Security 2016
DOI: 10.1145/2970030.2970041
Data set operations to hide decision tree rules

Abstract: This paper focuses on preserving the privacy of sensitive patterns when inducing decision trees. We adopt a record augmentation approach for hiding sensitive classification rules in binary datasets. Such a hiding methodology is preferred over other heuristic solutions, such as output perturbation or cryptographic techniques, which restrict the usability of the data, since the raw data itself remains readily available for public use. We show some key lemmas related to the hiding process and we also …

Cited by 8 publications (7 citation statements) · References 18 publications
“…These two passes help us firstly hide all sensitive rules and secondly keep the sanitized tree close to the structure of the original decision tree. We note that the above two procedures have been fully described in previously published works [16,17].…”
Section: Methods
confidence: 99%
“…In our previously published work [16,17], the above procedure was greedy, mostly solving the issue for only one (tree) level of nodes, which often resulted in a non-minimum number of added instances, whereas a look-ahead based solution would be able to take into account all levels up to the … We used the information gain as the splitting heuristic. In order to hide the leaf which corresponds to the nine positive instances (to the right of N0), we changed the nine positive instances to negative ones and denoted this operation by (−9p, +9n).…”
Section: Lemma 1: The Entropy of a Node Only Depends on the Ratio of …
confidence: 99%
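The quoted lemma and the (−9p, +9n) operation can be illustrated with a minimal sketch (not the authors' code; the `entropy` helper below is an assumed implementation of binary Shannon entropy): nodes with the same positive-to-negative ratio have identical entropy regardless of absolute counts, and flipping all nine positive instances of a leaf to negative leaves a pure, zero-entropy leaf.

```python
from math import log2

def entropy(pos: int, neg: int) -> float:
    """Binary Shannon entropy of a node holding pos/neg class counts."""
    if pos == 0 or neg == 0:
        return 0.0  # a pure node carries no uncertainty
    r = pos / (pos + neg)
    return -r * log2(r) - (1 - r) * log2(1 - r)

# Lemma 1: entropy depends only on the ratio pos:neg, not the counts.
assert abs(entropy(9, 3) - entropy(18, 6)) < 1e-12

# The (-9p, +9n) hiding operation: the leaf of nine positive instances
# becomes a leaf of nine negative ones; both states are pure leaves.
assert entropy(9, 0) == entropy(0, 9) == 0.0
```

Because entropy, and hence information gain, is ratio-invariant, record augmentation can rescale class counts at a node without disturbing the splits chosen elsewhere in the tree, which is what keeps the sanitized tree close to the original structure.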