Differential privacy provides a rigorous framework to quantify data privacy and has received considerable interest recently. A randomized mechanism satisfying (ε, δ)-differential privacy (DP) roughly means that, except with a small probability δ, altering a record in a dataset cannot change the probability of any output being seen by more than a multiplicative factor e^ε. A well-known solution to (ε, δ)-DP is the Gaussian mechanism, initiated by Dwork et al. [1] in 2006 and improved by Dwork and Roth [2] in 2014, where Gaussian noise of amount √(2 ln(2/δ)) · Δ/ε in [1], or √(2 ln(1.25/δ)) · Δ/ε in [2], is added independently to each dimension of the query result, for a query with ℓ₂-sensitivity Δ. Although both classical Gaussian mechanisms [1], [2] explicitly assume 0 < ε ≤ 1 only, our review finds that many studies in the literature have used them under values of ε and δ for which we show the added noise amounts of [1], [2] do not achieve (ε, δ)-DP. We obtain this result by analyzing the optimal (i.e., least) Gaussian noise amount σ_DP-OPT for (ε, δ)-DP and identifying the set of ε and δ where the noise amounts of the classical Gaussian mechanisms are less than σ_DP-OPT. The inapplicability of the mechanisms of [1], [2] to large ε can also be seen from our result that σ_DP-OPT for large ε scales as Θ(1/√ε), not Θ(1/ε). Since σ_DP-OPT has no closed-form expression and must be approximated iteratively, we propose Gaussian mechanisms based on closed-form upper bounds for σ_DP-OPT. Our mechanisms achieve (ε, δ)-DP for any ε, whereas the classical Gaussian mechanisms [1], [2] fail to achieve (ε, δ)-DP for large ε given δ.
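As a concrete illustration of the classical calibration described above (a sketch of ours, not code from the paper; function names are illustrative), the noise scales √(2 ln(2/δ)) · Δ/ε of [1] and √(2 ln(1.25/δ)) · Δ/ε of [2] can be computed and applied per coordinate as follows:

```python
import math
import random

def classical_gaussian_sigma(eps, delta, sensitivity, variant="dwork-roth"):
    """Noise scale of the classical Gaussian mechanisms.

    variant "dwork":      sigma = sqrt(2 ln(2/delta))    * Delta / eps   (as in [1])
    variant "dwork-roth": sigma = sqrt(2 ln(1.25/delta)) * Delta / eps   (as in [2])
    Both are only claimed to satisfy (eps, delta)-DP for 0 < eps <= 1.
    """
    c = 2.0 if variant == "dwork" else 1.25
    return math.sqrt(2.0 * math.log(c / delta)) * sensitivity / eps

def gaussian_mechanism(query_result, eps, delta, sensitivity):
    # Add independent Gaussian noise to each dimension of the query result.
    sigma = classical_gaussian_sigma(eps, delta, sensitivity)
    return [x + random.gauss(0.0, sigma) for x in query_result]
```

For example, with ε = 1, δ = 10⁻⁵, and Δ = 1, the Dwork-Roth calibration gives σ ≈ 4.84; the 2006 calibration is slightly larger since ln(2/δ) > ln(1.25/δ).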
Moreover, the utilities of our proposed Gaussian mechanisms improve on those of the classical Gaussian mechanisms [1], [2] and are close to that of the optimal, yet more computationally expensive, Gaussian mechanism. Since most mechanisms proposed in the literature for (ε, δ)-DP are obtained by ensuring a condition called (ε, δ)-probabilistic differential privacy (pDP), we also present an extensive discussion of (ε, δ)-pDP, including deriving Gaussian noise amounts that achieve it. To summarize, our paper fixes the literature's long-standing misuse of the Gaussian mechanisms [1], [2] for (ε, δ)-differential privacy and provides a comprehensive study of Gaussian mechanisms.
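To make the iterative approximation of σ_DP-OPT concrete, the sketch below (ours, not the paper's code) bisects on σ using the exact Gaussian privacy profile Φ(Δ/(2σ) − εσ/Δ) − e^ε · Φ(−Δ/(2σ) − εσ/Δ) ≤ δ, a characterization known from the analytic-Gaussian-mechanism literature; function names are illustrative:

```python
import math

def Phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def delta_of_sigma(sigma, eps, sensitivity=1.0):
    # Smallest delta for which Gaussian noise of scale sigma gives
    # (eps, delta)-DP (assumed exact characterization; the optimal
    # sigma_DP-OPT is the least sigma making this at most delta).
    a = sensitivity / (2.0 * sigma)
    b = eps * sigma / sensitivity
    return Phi(a - b) - math.exp(eps) * Phi(-a - b)

def sigma_dp_opt(eps, delta, sensitivity=1.0, iters=100):
    # Bisection for the least sigma with delta_of_sigma(sigma) <= delta;
    # relies on delta_of_sigma being decreasing in sigma.
    lo, hi = 1e-6, 1.0
    while delta_of_sigma(hi, eps, sensitivity) > delta:
        hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if delta_of_sigma(mid, eps, sensitivity) > delta:
            lo = mid
        else:
            hi = mid
    return hi
```

Under this characterization one can check numerically that for a large ε such as ε = 10 with δ = 10⁻⁵, the classical scale √(2 ln(1.25/δ)) · Δ/ε falls below the bisected σ_DP-OPT, consistent with the failure of (ε, δ)-DP discussed above.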