The paper describes a new conjugate gradient algorithm for large-scale nonconvex problems with box constraints. To speed up convergence, the algorithm employs a scaling matrix that transforms the space of the original variables into a space in which the Hessian matrices of the functionals describing the problems have more clustered eigenvalues. This is done efficiently by applying limited memory BFGS updating matrices. Once the scaling matrix is calculated, the next few iterations of the conjugate gradient algorithm are performed in the transformed space. The box constraints are treated by the projection previously used in [R. Pytlak, The efficient algorithm for large-scale problems with simple bounds on the variables, SIAM J. on Optimization, Vol. 8, pp. 532-560, 1998]. We believe that the preconditioned conjugate gradient algorithm gives more flexibility in balancing the computing time against the number of function evaluations than a limited memory BFGS algorithm. The numerical results show that the proposed method is competitive with the L-BFGS-B procedure.

Keywords: bound constrained nonlinear optimization problems, conjugate gradient algorithms, quasi-Newton methods.
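To make the preconditioning idea concrete, the following is a minimal sketch (not the paper's algorithm) of the two ingredients the abstract mentions: a limited memory BFGS inverse-Hessian estimate applied via the standard two-loop recursion, and a projection of the resulting step onto the box constraints. The quadratic test problem, the function name `lbfgs_apply`, and the choice of correction pairs are illustrative assumptions.

```python
import numpy as np

def lbfgs_apply(g, pairs):
    """Two-loop recursion: apply the L-BFGS inverse-Hessian estimate to g.

    `pairs` is a list of (s, y) correction pairs, oldest first, where
    s is a step in the variables and y the corresponding gradient change.
    (Illustrative sketch; not the scaling used in the paper.)
    """
    q = g.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in pairs]
    alphas = []
    # Backward pass over the pairs, newest first.
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial scaling gamma = s'y / y'y from the most recent pair.
    if pairs:
        s, y = pairs[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # Forward pass, oldest first, consuming the alphas in matching order.
    for (s, y), rho, a in zip(pairs, rhos, reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return q

# Toy box-constrained quadratic: min 0.5 x'Ax - b'x  s.t.  0 <= x <= 1.
A = np.diag([1.0, 10.0, 100.0])          # ill-conditioned Hessian
b = np.ones(3)
pairs = [(e, A @ e) for e in np.eye(3)]  # exact curvature pairs for A
x = np.full(3, 0.5)
g = A @ x - b
# Preconditioned step, then projection onto the box [0, 1]^3.
x_new = np.clip(x - lbfgs_apply(g, pairs), 0.0, 1.0)
```

With exact curvature pairs on this diagonal quadratic, the two-loop recursion reproduces the Newton direction, so a single projected step lands on the constrained minimizer; in general the estimate only clusters the eigenvalues of the transformed Hessian, which is the effect the paper exploits to accelerate the conjugate gradient iterations.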