In a recent paper, Gaunt (Ann I H Poincare Probab Stat 56:1484–1513, 2020) extended Stein’s method to limit distributions that can be represented as a function $$g:{\mathbb {R}}^d\rightarrow {\mathbb {R}}$$
of a centred multivariate normal random vector $$\Sigma ^{1/2}{\textbf{Z}}$$
with $${\textbf{Z}}$$ a standard d-dimensional multivariate normal random vector and $$\Sigma $$
a non-negative-definite covariance matrix. In this paper, we obtain improved bounds, in the sense of weaker moment conditions, smaller constants and simpler forms, for the case that g has derivatives with polynomial growth. We obtain new non-uniform bounds for the derivatives of the solution of the Stein equation and use these inequalities to obtain general bounds on the distance, measured using smooth test functions, between the distributions of $$g({\textbf{W}}_n)$$
and $$g({\textbf{Z}})$$
, where $${\textbf{W}}_n$$
is a standardised sum of random vectors with independent components and $${\textbf{Z}}$$
is a standard d-dimensional multivariate normal random vector. We apply these general bounds to obtain bounds for the chi-square approximation of the family of power divergence statistics (special cases include the Pearson and likelihood ratio statistics), for the case of two cell classifications, that improve on existing results in the literature.
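For context, the power divergence family referred to above is standardly defined (following Cressie and Read) for observed cell counts $$X_1,\dots ,X_d$$ with $$\sum _{j=1}^d X_j=n$$ and classification probabilities $$p_1,\dots ,p_d$$ by

$$T_\lambda =\frac{2}{\lambda (\lambda +1)}\sum _{j=1}^{d}X_j\left[ \left( \frac{X_j}{np_j}\right) ^{\lambda }-1\right] ,\quad \lambda \in {\mathbb {R}},$$

where $$\lambda =1$$ recovers Pearson’s chi-square statistic and the limit $$\lambda \rightarrow 0$$ the likelihood ratio statistic.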