In large-scale optimization, the standard BFGS method is impractical because of its memory requirements. The limited-memory BFGS (L-BFGS) method is an adaptation of BFGS to large-scale settings. However, the standard BFGS method, and hence the standard L-BFGS method, uses only gradient information of the objective function and neglects function values. In this paper, we propose a new regularized L-BFGS method for solving large-scale unconstrained optimization problems in which additional available information from function and gradient values is employed to approximate the curvature of the objective function. The proposed method utilizes a class of modified quasi-Newton equations to achieve higher-order accuracy in approximating the second-order curvature of the objective function. Under standard assumptions, we establish the global convergence of the new method. To obtain an efficient method for finding global minima of a continuously differentiable function, we also propose a hybrid algorithm that combines a genetic algorithm (GA) with the new regularized L-BFGS method. This combination drives the iterates to a stationary point of the objective function with a higher chance of being a global minimizer. Numerical results demonstrate the efficiency and robustness of the proposed regularized L-BFGS method and its hybridization with GA in practice.
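To make the idea concrete, one well-known member of this class of modified quasi-Newton equations, due to Zhang, Deng and Chen, replaces the standard secant condition $B_{k+1} s_k = y_k$ (where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$) with a condition that also uses function values; the specific modified equation adopted in this paper is of this general form, though its exact choice may differ:
\[
B_{k+1} s_k = \tilde{y}_k, \qquad
\tilde{y}_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k, \qquad
\theta_k = 6\left(f_k - f_{k+1}\right) + 3\left(g_k + g_{k+1}\right)^{T} s_k,
\]
where $u_k$ is any vector satisfying $s_k^{T} u_k \neq 0$ (e.g., $u_k = s_k$). Whereas the standard secant equation gives $s_k^{T} y_k = s_k^{T} \nabla^2 f(x_{k+1})\, s_k + O(\|s_k\|^{3})$, this modified equation satisfies $s_k^{T} \tilde{y}_k = s_k^{T} \nabla^2 f(x_{k+1})\, s_k + O(\|s_k\|^{4})$, which is the sense in which function values yield higher-order curvature accuracy.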