Nonlinear least squares (NLS) problems arise in many applications. Common solvers require computing and storing the Jacobian matrix explicitly, which is too expensive for large problems. Recently, several Jacobian-free (or matrix-free) methods have been proposed, but most of them are not truly Jacobian free, since the full or partial Jacobian matrix still has to be computed at some iteration steps. In this paper, we propose an effective, truly Jacobian-free method for large NLS problems, realized by a novel combination of automatic differentiation for the products J(x)v and J(x)^T v with implicit iterative preconditioning. Together, these ideas yield a new and effective three-level iterative approach. At the outer level, the dogleg/trust-region method is employed to solve the NLS problem. At the middle level, the linear least squares (LLS) problem generated at each dogleg step is solved by an iterative LLS solver, CGLS or BA-GMRES. To accelerate the convergence of the iterative LLS solver, we propose an implicit inner-iteration preconditioner based on the weighted Jacobi method. Compared with existing Jacobian-free methods, our three-level method never computes any part of the Jacobian matrix explicitly at any iteration step. Furthermore, it does not rely on the sparsity or structure of the Jacobian, gradient, or Hessian matrix; in other words, it also works well for dense Jacobian matrices. Numerical experiments demonstrate the superiority of the proposed method.
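The key ingredient above, evaluating J(x)v and J(x)^T v by automatic differentiation without ever forming J(x), can be sketched as follows. This is an illustrative example only, not the paper's implementation: it uses JAX's forward-mode (`jvp`) and reverse-mode (`vjp`) AD on a small hypothetical residual function.

```python
import jax
import jax.numpy as jnp

def residual(x):
    # Toy nonlinear residual r: R^2 -> R^3 (hypothetical example problem).
    return jnp.array([x[0] ** 2 + x[1] - 1.0,
                      x[0] - x[1] ** 2,
                      jnp.sin(x[0] * x[1])])

x = jnp.array([0.5, 0.3])       # current iterate
v = jnp.array([1.0, -1.0])      # direction in parameter space
w = jnp.array([1.0, 0.0, 2.0])  # direction in residual space

# Forward-mode AD: one jvp call yields J(x)v, Jacobian-free.
_, Jv = jax.jvp(residual, (x,), (v,))

# Reverse-mode AD: one vjp call yields J(x)^T w, Jacobian-free.
_, vjp_fun = jax.vjp(residual, x)
(JTw,) = vjp_fun(w)

# Sanity check against the explicit Jacobian (feasible only for tiny problems).
J = jax.jacfwd(residual)(x)
assert jnp.allclose(Jv, J @ v)
assert jnp.allclose(JTw, J.T @ w)
```

Since CGLS and BA-GMRES access the matrix only through products with J and J^T, these two AD primitives are all an iterative LLS solver needs, which is what makes the overall scheme fully Jacobian free.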