2018
DOI: 10.2478/popets-2019-0003

RON-Gauss: Enhancing Utility in Non-Interactive Private Data Release

Abstract: A key challenge facing the design of differential privacy in the non-interactive setting is to maintain the utility of the released data. To overcome this challenge, we utilize the Diaconis-Freedman-Meckes (DFM) effect, which states that most projections of high-dimensional data are nearly Gaussian. Hence, we propose the RON-Gauss model that leverages the novel combination of dimensionality reduction via random orthonormal (RON) projection and the Gaussian generative model for synthesizing differentially-priva…
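The abstract names the two ingredients only at a high level. Below is a minimal sketch of how they fit together (random orthonormal projection followed by a differentially private Gaussian generative model); the Laplace noise calibration, budget handling, and normalization assumptions are illustrative simplifications, not the paper's exact mechanism.

```python
import numpy as np

def ron_gauss_sketch(X, p, epsilon, rng=None):
    """Illustrative sketch of the RON-Gauss idea (not the exact algorithm).

    X       : (n, d) data matrix, assumed preprocessed so each projected
              row has L2 norm at most 1 (needed for the noise scales below).
    p       : target dimension of the random orthonormal projection.
    epsilon : privacy budget (naively reused for mean and covariance here,
              an assumption for illustration only).
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n, d = X.shape

    # Random orthonormal (RON) projection: orthonormalize a random
    # Gaussian matrix via QR and keep its p columns.
    W, _ = np.linalg.qr(rng.standard_normal((d, p)))
    Z = X @ W                                   # (n, p) projected data

    # Noisy first and second moments (Laplace mechanism).
    mu = Z.mean(axis=0) + rng.laplace(scale=2.0 / (n * epsilon), size=p)
    cov = (Z - mu).T @ (Z - mu) / n
    cov += rng.laplace(scale=2.0 / (n * epsilon), size=(p, p))

    # Re-symmetrize and clip eigenvalues so the noisy covariance is PSD.
    cov = (cov + cov.T) / 2
    vals, vecs = np.linalg.eigh(cov)
    cov = (vecs * np.clip(vals, 1e-6, None)) @ vecs.T

    # Synthetic records are drawn from the private Gaussian model.
    return rng.multivariate_normal(mu, cov, size=n), W
```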

Cited by 17 publications (17 citation statements) | References 101 publications
“…Their approach releases a record only if it passes the privacy test of being similar to at least k records from the original dataset, hence the plausible deniability. A notable drawback of this approach is the high computational time and the large amount of added noise, which make it difficult to scale to high-dimensional datasets, as pointed out in [15].…”
Section: Related Work On Private Data Synthesis (mentioning)
confidence: 99%
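As a rough illustration of the test described in the quote above (release a record only if it is similar to at least k original records), a minimal sketch follows; the Euclidean distance threshold is an assumed notion of similarity, not the cited paper's definition.

```python
import numpy as np

def passes_privacy_test(candidate, original, k, threshold):
    """Release `candidate` only if it is 'similar' (within an assumed
    Euclidean distance `threshold`) to at least k original records."""
    dists = np.linalg.norm(original - candidate, axis=1)
    return int(np.count_nonzero(dists <= threshold)) >= k
```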
“…Gaussian models are widely used for data modelling, independently of the properties of the data itself, as the structure of the data can be accurately estimated from only its mean and covariance. Recently, Chanyaswad, Liu, and Mittal proposed RON-Gauss, a differentially-private method for generating high-dimensional private data through the combination of random orthonormal (RON) projection and Gaussian models [15]. RON is a dimensionality reduction technique that lowers the amount of noise added to achieve differential privacy and exploits the Diaconis-Freedman-Meckes (DFM) effect, which states that ‘under suitable conditions, most projections are approximately Gaussian’.…”
Section: Related Work On Private Data Synthesis (mentioning)
confidence: 99%
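To make the quoted DFM statement concrete, the short sketch below projects clearly non-Gaussian high-dimensional data onto a random unit direction and checks that the one-dimensional projection has near-Gaussian skewness and kurtosis. The data distribution and dimensions are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 500
X = rng.integers(0, 2, size=(n, d)).astype(float)  # non-Gaussian {0,1} data
X -= X.mean(axis=0)                                 # center each feature

u = rng.standard_normal(d)
u /= np.linalg.norm(u)                              # random unit direction
z = X @ u                                           # 1-D projection
z = (z - z.mean()) / z.std()

print(f"skewness        = {np.mean(z**3):+.3f}  (0 for a Gaussian)")
print(f"excess kurtosis = {np.mean(z**4) - 3.0:+.3f}  (0 for a Gaussian)")
```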
“…Since then, multiple privacy-preserving data release techniques have been proposed [10-16]. However, the methods have so far been limited to special cases, such as discrete data [10, 12-15, 17] or having to draw a synthetic dataset from noisy histograms [15, 16]. More recent work has employed more powerful models.…”
Section: Introduction (mentioning)
confidence: 99%
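The "noisy histogram" approach the quote contrasts with is easy to sketch: perturb histogram counts with Laplace noise, renormalize, and resample. The bin handling and noise scale below are assumptions for illustration, not a specific cited method.

```python
import numpy as np

def histogram_release_sketch(x, bins, epsilon, n_syn, rng=None):
    """Sketch of histogram-based private data release.

    x       : 1-D array of values.
    bins    : array of bin edges.
    epsilon : privacy budget; Laplace scale 1/epsilon assumes each
              individual contributes to exactly one bin (sensitivity 1).
    n_syn   : number of synthetic samples to draw.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    counts, edges = np.histogram(x, bins=bins)

    # Laplace mechanism on the counts, then renormalize to a distribution.
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    noisy = np.clip(noisy, 0, None)          # counts cannot be negative
    probs = noisy / noisy.sum()

    # Draw synthetic records uniformly within each sampled bin.
    idx = rng.choice(len(probs), size=n_syn, p=probs)
    return rng.uniform(edges[idx], edges[idx + 1])
```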
“…More recent work has employed more powerful models [11, 18, 19]. These methods have been shown to be much more efficient and general than previous attempts. However, these methods, as well as other data-sharing works, share a limitation: they are not able to use existing (prior) knowledge about the dataset.…”
Section: Introduction (mentioning)
confidence: 99%
“…Differential privacy has been widely adopted by the machine learning research community as the standard for privacy protection [13]-[23]. Some studies focus on a particular machine learning algorithm and develop a differentially private version of it, such as differentially private logistic regression [13], differentially private principal component analysis (PCA) [14], and differentially private matrix factorization [15], while others apply differential privacy to release data in a privacy-preserving manner [16]-[18]. Still others focus on general frameworks that can be applied to many algorithms in a uniform way, such as PrivGene [19], a framework for differentially private model fitting based on genetic algorithms; differentially private SGD [20], which can be applied to algorithms optimized with stochastic gradient descent; and PATE [21], which outperforms existing approaches on MNIST and SVHN with an improved privacy analysis and a novel teacher-student framework.…”
Section: Introduction (mentioning)
confidence: 99%
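For concreteness, the sketch below shows the core DP-SGD recipe the quote refers to (per-example gradient clipping followed by Gaussian noise on the averaged gradient), applied to an assumed linear-regression loss; the loss, parameter names, and noise calibration are illustrative choices, not taken from [20].

```python
import numpy as np

def dp_sgd_step(w, X_batch, y_batch, lr, clip_norm, noise_mult, rng):
    """One DP-SGD-style update: clip each per-example gradient to bound
    its sensitivity, average, add Gaussian noise, then take a step."""
    grads = []
    for x, y in zip(X_batch, y_batch):
        g = 2.0 * (w @ x - y) * x                               # squared-loss gradient
        g *= min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))  # per-example clipping
        grads.append(g)

    g_avg = np.mean(grads, axis=0)
    noise = rng.normal(scale=noise_mult * clip_norm / len(X_batch), size=w.shape)
    return w - lr * (g_avg + noise)
```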