In this paper, we investigate the benefit of intentionally adding noise to observed data in various scenarios of Bayesian parameter estimation. For optimal estimators, we show theoretically that the Bayesian Cramér-Rao bound for the noise-modified data is never smaller than that for the original data, and that the corresponding minimum mean-square error (MSE) estimator performs no better. This motivates us to explore the feasibility of noise benefits in useful suboptimal estimators. Several Bayesian estimators constructed from one-bit-quantizer sensors are considered, and, for different types of pre-existing background noise, optimal distributions of the added noise are determined so as to improve the estimation performance. With a single sensor, the optimal added noise for reducing the MSE turns out to be a constant bias. With parallel arrays of such sensors, however, bona fide optimal added noise, no longer a constant bias, is shown to reduce the MSE. Moreover, the designed Bayesian estimators can exploit the optimal added noise to closely approach the performance of the minimum MSE estimator, even when the assembled sensors have different quantization thresholds.

INDEX TERMS Bayesian estimator, Bayesian information, bona fide optimal noise, noise benefit, parameter estimation.
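The following Monte Carlo sketch illustrates the kind of noise benefit described above for a parallel array of one-bit quantizers, not the paper's own estimators or derivations. All parameters (Gaussian prior, common threshold, background and added noise levels, array size) are assumptions chosen for illustration, and the estimator used is simply the posterior mean computed from the binary outputs; sweeping the RMS level of the added Gaussian noise shows the empirical MSE dropping below the no-added-noise case when the threshold is poorly matched to the prior.

```python
# Illustrative sketch (assumed setup, not the paper's method): K one-bit sensors
# observe theta + background noise + optional added noise, each compares against a
# threshold tau, and a grid-based posterior-mean estimator of theta is evaluated.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

K = 15              # number of one-bit sensors in the parallel array (assumed)
tau = 1.5           # common quantization threshold, offset from the prior mean (assumed)
sigma_prior = 1.0   # theta ~ N(0, sigma_prior^2)  (assumed prior)
sigma_bg = 0.1      # weak Gaussian background noise at each sensor (assumed)
n_trials = 5000

# Discretized Gaussian prior over a grid of candidate theta values
theta_grid = np.linspace(-5.0, 5.0, 401)
prior = np.exp(-0.5 * (theta_grid / sigma_prior) ** 2)
prior /= prior.sum()

def run(sigma_add):
    """Empirical MSE of the posterior-mean estimator when i.i.d. N(0, sigma_add^2)
    noise is added at each sensor before one-bit quantization."""
    sigma_eff = np.hypot(sigma_bg, sigma_add)              # total per-sensor noise RMS
    # P(y_k = 1 | theta) on the grid; identical for every sensor and trial
    p1 = norm.sf(tau - theta_grid, scale=sigma_eff)
    se = 0.0
    for _ in range(n_trials):
        theta = rng.normal(0.0, sigma_prior)
        y = theta + rng.normal(0.0, sigma_eff, size=K) > tau   # K binary outputs
        n_ones = int(y.sum())
        # Posterior over the grid given the binomial count of ones, then its mean
        post = p1 ** n_ones * (1.0 - p1) ** (K - n_ones) * prior
        post /= post.sum()
        theta_hat = post @ theta_grid
        se += (theta_hat - theta) ** 2
    return se / n_trials

for sigma_add in (0.0, 0.5, 1.0, 1.5):                     # assumed added-noise levels
    print(f"added-noise RMS {sigma_add:.1f}:  empirical MSE = {run(sigma_add):.3f}")
```

In this toy configuration the one-bit outputs carry little information about theta when only the weak background noise is present, so a moderate amount of added Gaussian noise lowers the empirical MSE; the paper's results concern the optimal distribution of such added noise rather than a simple RMS sweep.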