Ensuring positive definiteness and avoiding ill-conditioning of the Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method are significant challenges in solving nonconvex problems. This paper proposes a novel stochastic damped and regularized BFGS method to address these problems. While the proposed regularization strategy helps to keep the BFGS matrix away from singularity, the new damped parameter further ensures positivity of the inner product of the correction pairs. To reduce the computational cost of the stochastic LBFGS updates and to improve their robustness, the curvature information is updated using the averaged iterate at spaced intervals. The effectiveness of the proposed method is evaluated on logistic regression and Bayesian logistic regression problems in machine learning. Numerical experiments are conducted on both a synthetic dataset and several real datasets. The results show that the proposed method generally outperforms the stochastic damped limited-memory BFGS (SdLBFGS) method. In particular, for problems with small sample sizes, our method shows superior performance and is capable of mitigating ill-conditioning. Furthermore, our method is more robust than SdLBFGS to variations of the batch size and memory size.

Index Terms: nonconvex optimization, stochastic quasi-Newton method, LBFGS, damped parameter, nonconjugate exponential models, variational inference.
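The abstract does not specify the exact form of the paper's damped parameter or regularization. As a point of reference, the sketch below illustrates a classical Powell-style damping of the correction pair (s, y), which guarantees a positive curvature product before an (L)BFGS update; the function name `damped_pair` and the threshold `delta` are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def damped_pair(s, y, B, delta=0.2):
    """Powell-style damping (illustrative; not necessarily the paper's rule):
    replace y by a convex combination of y and B @ s so that the damped pair
    satisfies s^T y_bar = max(s^T y, delta * s^T B s) >= delta * s^T B s > 0,
    keeping the subsequent BFGS update positive definite."""
    sBs = float(s @ B @ s)
    sy = float(s @ y)
    if sy >= delta * sBs:
        theta = 1.0                                  # curvature already sufficiently positive
    else:
        theta = (1.0 - delta) * sBs / (sBs - sy)     # shrink y towards B @ s
    y_bar = theta * y + (1.0 - theta) * (B @ s)
    return s, y_bar

# Tiny check: even under negative curvature (s^T y < 0), the damped pair is positive.
rng = np.random.default_rng(0)
s = rng.standard_normal(5)
y = -s                                               # worst case: strongly negative curvature
B = np.eye(5)                                        # current Hessian approximation
_, y_bar = damped_pair(s, y, B)
assert s @ y_bar > 0
```

In a limited-memory implementation, a pair damped in this way would be stored in the correction history in place of the raw (s, y), which is what keeps the two-loop recursion well defined even on nonconvex objectives.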