Many signal processing problems involve a Generalized Linear Model (GLM), a linear model in which the unknowns may be non-identically independently distributed (n.i.i.d.). Vector Approximate Message Passing for Generalized Linear Models (GVAMP) is a computationally efficient belief-propagation technique for Bayesian inference in such models. However, the posterior variances produced by GVAMP at limited complexity are exact only under the assumption of an independent and identically distributed (i.i.d.) prior, owing to the averaging operations involved. In many problems it is desirable not only to estimate the unknowns but also to obtain accurate posterior distributions. While VAMP, and especially AMP, target high-dimensional problems, many applications involve dimensions that are not excessively large, so more complex operations become affordable. Moreover, in finite dimensions the asymptotic regime that yields correct variances under certain measurement-matrix model assumptions does not apply. To overcome these limitations, we propose a revised version of GVAMP, named reGVAMP. This method provides a multivariate Gaussian posterior approximation that captures interparameter correlations, and it yields accurate posterior marginals requiring only that the extrinsic distributions become Gaussian.