“…Linear regression is of course a true workhorse of statistics, and there has been a significant body of work on the design of computationally and statistically efficient differentially private regression algorithms (see, e.g., the recent surveys of Cai et al. (2020) and Wang (2018) and the references therein). Approaches include objective perturbation (Iyengar et al., 2019; Kifer et al., 2012; Zhang et al., 2012; Chaudhuri et al., 2011), output perturbation (Asi and Duchi, 2020; Iyengar et al., 2019; Zhang et al., 2017; Jain and Thakurta, 2014), gradient perturbation (Abadi et al., 2016; Bassily et al., 2014), subsample-and-aggregate (Barrientos et al., 2019; Dwork and Smith, 2010), and sufficient statistics perturbation (Alabi et al., 2020; Wang, 2018; McSherry and Mironov, 2009). Additionally, several works study generalizations of such mechanisms to Generalized Linear Models (GLMs) (Kulkarni et al., 2021; Iyengar et al., 2019; Jain and Thakurta, 2014; Kifer et al., 2012).…”
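
To make one of these mechanism families concrete, the following is a minimal sketch of sufficient statistics perturbation for linear regression: Laplace noise is added to the OLS sufficient statistics X^T X and X^T y before solving the normal equations. The function name, noise calibration, and ridge term are illustrative assumptions, not the exact mechanism of any cited paper.

```python
import numpy as np

def dp_linreg_ssp(X, y, epsilon, rng=None):
    """Differentially private linear regression via sufficient
    statistics perturbation (an illustrative sketch).

    Assumes each row of X and each response y[i] is norm-bounded
    by 1, so the per-statistic sensitivity is O(1); real mechanisms
    calibrate the noise scale to the exact sensitivity.
    """
    if rng is None:
        rng = np.random.default_rng()
    d = X.shape[1]
    # Sufficient statistics of ordinary least squares.
    XtX = X.T @ X
    Xty = X.T @ y
    # Split the privacy budget between the two statistics and
    # perturb each entry with Laplace noise (scale 2/epsilon is
    # a placeholder calibration under the boundedness assumption).
    noisy_XtX = XtX + rng.laplace(scale=2.0 / epsilon, size=(d, d))
    noisy_XtX = (noisy_XtX + noisy_XtX.T) / 2  # keep it symmetric
    noisy_Xty = Xty + rng.laplace(scale=2.0 / epsilon, size=d)
    # Solve the noisy normal equations; a small ridge term keeps
    # the perturbed matrix well-conditioned.
    return np.linalg.solve(noisy_XtX + 1e-3 * np.eye(d), noisy_Xty)
```

Only the released statistics depend on the data, so any post-processing (here, solving the linear system) preserves the privacy guarantee.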