2013
DOI: 10.1016/j.procs.2013.09.295
A Comparative Analysis of Data Privacy and Utility Parameter Adjustment, Using Machine Learning Classification as a Gauge

Cited by 21 publications (18 citation statements)
References 13 publications
“…The Filtered x-CEG, an adaptation of the Comparative x-CEG heuristic model outlined in Mivule and Turner (2013), is suggested [6]. Signal processing techniques, such as discrete cosine transforms, are used in the Filtered x-CEG, illustrated in Figure 1, unlike the model in [6], which does not involve signal processing methods [17].…”
Section: Methods (mentioning)
confidence: 99%
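The excerpt above notes that the Filtered x-CEG applies a discrete cosine transform before perturbation. A minimal sketch of DCT-domain perturbation of a numeric attribute follows; the `keep_fraction` and `noise_scale` parameters are illustrative assumptions, not values from the cited paper.

```python
import numpy as np
from scipy.fft import dct, idct

def dct_perturb(values, keep_fraction=0.5, noise_scale=0.1, seed=0):
    """Perturb a numeric attribute in the DCT domain (illustrative sketch).

    keep_fraction and noise_scale are assumed parameters, not values
    taken from the Filtered x-CEG paper.
    """
    rng = np.random.default_rng(seed)
    coeffs = dct(np.asarray(values, dtype=float), norm="ortho")
    # Filtering step: zero out the high-frequency coefficients.
    cutoff = max(1, int(len(coeffs) * keep_fraction))
    coeffs[cutoff:] = 0.0
    # Privacy step: add noise to the retained low-frequency coefficients.
    scale = noise_scale * np.abs(coeffs[:cutoff]).mean()
    coeffs[:cutoff] += rng.normal(0.0, scale, size=cutoff)
    # Invert the transform to obtain the privatized attribute.
    return idct(coeffs, norm="ortho")

ages = [23, 45, 31, 52, 38, 27, 61, 44]
private_ages = dct_perturb(ages)
```

Because the low-frequency coefficients carry most of the signal's energy, the privatized values preserve the overall distribution while individual values are distorted.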
“…Consequently, usability would be a measure of the closeness between the original and privatized data [7]. However, in this study, the classification error is used as a gauge for data privacy and usability quantification [6]. On the subject of discrete cosine transforms and data privacy, studies have mostly been done in the image and audio processing areas, with a focus on access control rather than confidentiality [8][9][10][11].…”
Section: Background and Related Work (mentioning)
confidence: 99%
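The classification-error gauge described in the excerpt can be sketched generically: train the same classifier on the original and the privatized data, and treat the increase in error as a proxy for utility loss. The toy dataset, nearest-centroid classifier, and Gaussian noise scale below are illustrative assumptions, not the cited paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-class toy data: 100 points per class around distinct centers.
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(3.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

def centroid_error(X, y):
    """Error rate of a nearest-centroid classifier, used as a utility gauge."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    pred = (np.linalg.norm(X - c1, axis=1) <
            np.linalg.norm(X - c0, axis=1)).astype(int)
    return float(np.mean(pred != y))

# "Privatize" by additive Gaussian noise (the noise scale is an assumption).
X_priv = X + rng.normal(0.0, 2.0, X.shape)

err_orig = centroid_error(X, y)
err_priv = centroid_error(X_priv, y)
# A larger gap between err_priv and err_orig indicates greater utility loss
# from the perturbation.
```

The same comparison works with any classifier; the gauge only requires that the learner and evaluation protocol be held fixed across the original and privatized datasets.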
“…Perfect utility can be achieved by publishing data exactly as received, but this offers no privacy at all. In (Mivule & Turner, 2013), the more confidential the data is made, the more likely it is that the privatized data will decline in utility and may therefore become useless. (Doka et al., 2015) argue that utility may be achieved at the expense of runtime, since the anonymization process is a one-time process.…”
Section: Data Utility vs. Privacy (mentioning)
confidence: 99%
“…Moreover, the significant amount of randomization produced by certain DP algorithms results in low data utility. Existing perturbation mechanisms often ignore the connection between utility and privacy, even though improvement of one leads to deterioration of the other [32]. Furthermore, the inability to efficiently process high volumes of data and data streams makes the existing methods unsuitable for privacy-preservation in smart cyber-physical systems.…”
Section: Introduction (mentioning)
confidence: 99%
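The tension the excerpt describes, where improving privacy deteriorates utility, is easy to demonstrate with the standard Laplace mechanism of differential privacy: noise is drawn with scale sensitivity/ε, so a smaller ε (stronger privacy) produces proportionally larger error. A minimal sketch, with an assumed counting query of sensitivity 1 and illustrative ε values:

```python
import numpy as np

def laplace_mech(true_value, sensitivity, epsilon, rng):
    """Release true_value with Laplace noise of scale sensitivity/epsilon."""
    return true_value + rng.laplace(0.0, sensitivity / epsilon)

rng = np.random.default_rng(7)
true_count = 1000.0  # a counting query, so sensitivity is 1

# Mean absolute error of the release at several privacy levels.
errors = {}
for eps in (0.01, 0.1, 1.0):
    noisy = [laplace_mech(true_count, 1.0, eps, rng) for _ in range(1000)]
    errors[eps] = float(np.mean([abs(v - true_count) for v in noisy]))
# The expected absolute error equals sensitivity / epsilon: stronger
# privacy (smaller epsilon) yields larger error, i.e., lower utility.
```

This inverse relationship is exactly the deterioration noted in [32]: tuning the mechanism for one side of the privacy/utility pair necessarily moves the other.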