We provide non-asymptotic convergence rates of the Polyak-Ruppert averaged stochastic gradient descent (SGD) iterates to a normal random vector for a class of twice-differentiable test functions. A crucial intermediate step is proving a non-asymptotic martingale central limit theorem (CLT), i.e., establishing rates of convergence of a multivariate martingale difference sequence to a normal random vector, which may be of independent interest. We obtain explicit rates for the multivariate martingale CLT using a combination of Stein's method and Lindeberg's argument, which we then use in conjunction with a non-asymptotic analysis of averaged SGD proposed in [PJ92]. Our results have potentially interesting consequences for computing confidence intervals for parameter estimation with SGD and for constructing hypothesis tests with SGD that are valid in a non-asymptotic sense.

* Authors are listed in alphabetical order. This paper provides a solution to an open problem formulated at the AIM workshop on Stein's method and Applications in High-dimensional Statistics, 2018 (problem (4) on page 2 of https://aimath.org/pastworkshops/steinhdrep.pdf, by KB). We thank Jay Bartroff, Larry Goldstein, Stanislav Minsker, and Gesine Reinert for organizing a stimulating workshop, and the participants for several discussions. MAE is partially funded by the CIFAR AI Chairs program at the Vector Institute.
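As an illustrative sketch (not taken from the paper), Polyak-Ruppert averaging runs plain SGD and reports the running average of the iterates; the CLT discussed above is what justifies treating this average as approximately normal around the optimum. The quadratic toy objective, step-size schedule, and all names below are hypothetical choices for illustration only.

```python
import random

def averaged_sgd(grad, theta0, n_steps, step_size):
    """Run one-dimensional SGD and return the Polyak-Ruppert average of the iterates."""
    theta = theta0
    avg = 0.0
    for k in range(1, n_steps + 1):
        theta -= step_size(k) * grad(theta)   # stochastic gradient step
        avg += (theta - avg) / k              # running mean of iterates
    return avg

# Toy problem: minimize E[(theta - x)^2]/2 with x ~ N(1, 1), so theta* = 1.
random.seed(0)
noisy_grad = lambda theta: theta - random.gauss(1.0, 1.0)

# Slowly decaying step sizes gamma_k = k^{-0.6}, in the spirit of [PJ92]-style analyses.
theta_bar = averaged_sgd(noisy_grad, theta0=0.0, n_steps=20000,
                         step_size=lambda k: k ** -0.6)
```

In this regime the averaged iterate `theta_bar` concentrates near the optimum, and a normal approximation of its fluctuations is what would underlie confidence intervals of the form `theta_bar ± z * sigma_hat / sqrt(n)`.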