As more data is collected, analyzed, and published by researchers, companies, and government agencies, concerns about the privacy of the participating individuals have become more prominent (Lane et al., 2014). While many statistical disclosure control methods have been developed to combat this problem (Hundepool et al., 2012), differential privacy (DP) (Dwork et al., 2006) has emerged as the state-of-the-art framework for privacy protection, and is currently being implemented by Google (Erlingsson et al., 2014), Apple (Tang et al., 2017), Microsoft (Ding et al., 2017), and the US Census Bureau (Abowd, 2018). Differential privacy is based on a notion of plausible deniability and requires the introduction of additional noise, beyond sampling, into the analysis procedure.
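The noise addition that DP requires is commonly instantiated via the Laplace mechanism, which perturbs a query's answer with noise calibrated to the query's sensitivity and the privacy parameter epsilon. A minimal sketch follows; the function name, the income data, and the bounded-mean sensitivity calculation are illustrative assumptions, not taken from the text:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise with scale
    sensitivity / epsilon, satisfying epsilon-differential privacy
    (illustrative helper, not from the source)."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Hypothetical example: privately release the mean of incomes
# assumed to be bounded in [0, 100000].
incomes = np.array([52000.0, 61000.0, 48000.0, 75000.0, 58000.0])
# Sensitivity of the bounded mean: changing one record moves the
# mean by at most (upper bound) / n.
sensitivity = 100000 / len(incomes)
private_mean = laplace_mechanism(incomes.mean(), sensitivity, epsilon=1.0)
```

Smaller epsilon means stronger privacy but larger noise; the released value is the true mean plus a random perturbation whose magnitude grows as epsilon shrinks.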