The evolution of random fields with known statistical properties is relatively straightforward to analyze in the linear regime, but becomes considerably more involved when nonlinearity, or interactions, become dominant. Previous works have shown that statistical-physics techniques can be applied to predict the evolution of such systems. Here we study the evolution of random fields in a one-dimensional lattice of optical waveguides in the presence of strong nonlinearities, using the discrete nonlinear Schrödinger equation. Extending the 2009 work by Silberberg et al. (Phys. Rev. Lett. 102, 233904), we assume input fields with random amplitudes and phases. We derive analytic expressions for the system's statistical properties at thermodynamic equilibrium: the probability density functions of the field intensities and of the phase differences between fields, as well as an expression for the field correlations. We express these properties in terms of the moments of the assumed statistical excitations and verify the results with simulations. Most interestingly, we find that at thermodynamic equilibrium, correlations form through the interactions between sites. These exponentially decaying field correlations take a universal form that is essentially independent of the excitation amplitudes, yet visibly shrinks as the spread of the exciting amplitudes increases. Our results are valid not only for nonlinear discrete optical systems, but extend also to the evolution of bosonic atoms in optical lattices, which in the high-occupancy limit is governed by the equivalent Gross-Pitaevskii equation.
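As a concrete illustration of the setting summarized above, the sketch below numerically propagates a one-dimensional discrete nonlinear Schrödinger lattice from a random-amplitude, random-phase input and estimates the resulting two-point field correlations. It is a minimal sketch, not the authors' code: the coupling C, nonlinear coefficient g, lattice size, step size, propagation length, and input distributions are all assumed for the example, and a single realization is averaged over sites only.

```python
# Minimal sketch (assumptions noted above): integrate the 1D DNLS
#   i dpsi_n/dz = -C (psi_{n+1} + psi_{n-1}) - g |psi_n|^2 psi_n
# with periodic boundaries and random-amplitude, random-phase input,
# then estimate the field correlation <psi_n^* psi_{n+d}> by averaging
# over lattice sites of one realization (an ensemble of random inputs
# would sharpen the estimate).
import numpy as np

rng = np.random.default_rng(0)

N = 256          # lattice sites (assumed)
C = 1.0          # nearest-neighbour coupling (assumed)
g = 1.0          # nonlinear coefficient (assumed)
dz = 1e-3        # propagation step (assumed)
steps = 200_000  # propagation length chosen to approach equilibrium (assumed)

def rhs(psi):
    """Right-hand side of dpsi/dz = i [C (psi_{n+1} + psi_{n-1}) + g |psi_n|^2 psi_n]."""
    neighbours = np.roll(psi, 1) + np.roll(psi, -1)
    return 1j * (C * neighbours + g * np.abs(psi) ** 2 * psi)

def rk4_step(psi, dz):
    """One classical Runge-Kutta step; norm and Hamiltonian are conserved only approximately."""
    k1 = rhs(psi)
    k2 = rhs(psi + 0.5 * dz * k1)
    k3 = rhs(psi + 0.5 * dz * k2)
    k4 = rhs(psi + dz * k3)
    return psi + dz / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Random input: amplitudes spread uniformly around 1, phases uniform in [0, 2*pi).
amps = rng.uniform(0.5, 1.5, N)
phases = rng.uniform(0.0, 2.0 * np.pi, N)
psi = amps * np.exp(1j * phases)

for _ in range(steps):
    psi = rk4_step(psi, dz)

# Two-point correlation versus site separation d, averaged over the lattice.
corr = np.array([np.mean(np.conj(psi) * np.roll(psi, -d)) for d in range(8)])
print("|<psi_n^* psi_{n+d}>| for d = 0..7:")
print(np.abs(corr))
```

Under these assumed parameters, the printed correlation magnitudes decay with separation d; probability density functions of |psi_n|^2 and of neighbouring phase differences can be histogrammed from the same final field to compare against analytic predictions.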