We present a fairly general framework for reducing (ε, δ) differentially private (DP) statistical estimation to its non-private counterpart. As the main application of this framework, we give a polynomial time and (ε, δ)-DP algorithm for learning (unrestricted) Gaussian distributions in R^d. The sample complexity of our approach for learning the Gaussian up to total variation distance matches (up to logarithmic factors) the best known information-theoretic (non-efficient) sample complexity upper bound of Aden-Ali, Ashtiani, and Kamath [1]. In an independent work, Kamath, Mouzakis, Singhal, Steinke, and Ullman [22] proved a similar result using a different approach and with O(d^{5/2}) sample complexity dependence on d. As another application of our framework, we provide the first polynomial time (ε, δ)-DP algorithm for robust learning of (unrestricted) Gaussians.