“…In machine learning, and specifically in the nascent field of federated learning (Konečný et al., 2016) (see, e.g., (Kairouz et al., 2019) for a recent survey), private summation enables private Stochastic Gradient Descent (SGD), which in turn allows the private training of deep neural networks that are guaranteed not to overfit to any user-specific information. Moreover, summation is perhaps the most primitive functionality in database systems in general, and in private implementations in particular (see, e.g., (Kotsogiannis et al., 2019; Wilson et al., 2019; Suresh et al., 2017)).…”
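To make the link between private summation and private SGD concrete, the following is a minimal sketch, assuming a simple Gaussian-noise aggregation of clipped per-user gradients. All function names and parameters (private_sum, private_sgd_step, clip_norm, noise_scale) are hypothetical illustrations, not the mechanism of any specific paper cited above, and noise_scale is left uncalibrated rather than tied to a particular (ε, δ) guarantee.

```python
import numpy as np

def private_sum(user_vectors, clip_norm=1.0, noise_scale=1.0, rng=None):
    """Illustrative private summation: clip each user's contribution to a
    bounded L2 norm, sum, and add Gaussian noise scaled to that bound."""
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for v in user_vectors:
        norm = np.linalg.norm(v)
        # Scale down any contribution whose norm exceeds the clipping bound,
        # so each user's influence on the sum is bounded by clip_norm.
        clipped.append(v * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise magnitude is proportional to the per-user sensitivity (clip_norm);
    # choosing noise_scale for a target privacy level is omitted here.
    noise = rng.normal(0.0, noise_scale * clip_norm, size=total.shape)
    return total + noise

def private_sgd_step(params, per_user_grads, lr=0.1, **sum_kwargs):
    """One SGD step driven by a private sum of per-user gradients
    (a toy sketch of the summation-to-SGD connection, not full DP-SGD)."""
    noisy_sum = private_sum(per_user_grads, **sum_kwargs)
    avg_grad = noisy_sum / len(per_user_grads)
    return params - lr * avg_grad

# Example: 5 users each contribute a gradient for a 3-parameter model.
params = np.zeros(3)
grads = [np.random.randn(3) for _ in range(5)]
params = private_sgd_step(params, grads, lr=0.1, clip_norm=1.0, noise_scale=0.5)
```

In this sketch the server only ever sees the noisy sum, which is the reason summation is the primitive of interest: once it can be computed privately, the rest of the SGD update is post-processing.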