Machine learning techniques have garnered significant attention across engineering disciplines owing to their demonstrated benefits. In reservoir numerical simulation, the core task is solving the partial differential equations that govern oil, gas, and water flow in porous media, and discretizing these equations with numerical methods is a cornerstone of the simulation process. Coupling traditional numerical methods with machine learning can improve the accuracy of this discretization. Machine learning algorithms can also be used to solve the partial differential equations directly, with reported rapid convergence, high computational efficiency, and accuracies exceeding 95%. This manuscript reviews the predominant numerical methods in reservoir simulation, focusing on how machine learning methodologies are being integrated with them. Recent advances in applying deep learning to solve reservoir partial differential equations are highlighted, together with a concise discussion of their advantages and limitations. As machine learning continues to evolve, its combination with numerical methods is poised to play a pivotal role in addressing complex reservoir engineering challenges.
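To make the direct-solution idea concrete, the following is a minimal, purely illustrative sketch (not drawn from the manuscript) of a physics-informed neural network that approximates the solution of a 1D single-phase pressure-diffusion equation, a simplified stand-in for the multiphase flow equations discussed above. The diffusivity, domain, and boundary and initial conditions are assumptions chosen only for demonstration.

```python
# Illustrative PINN sketch: approximate p(x, t) satisfying p_t = alpha * p_xx
# on x in [0, 1], t in [0, 1], with assumed conditions p(0, t) = p(1, t) = 0
# and p(x, 0) = sin(pi * x). All values here are demonstration choices.
import math
import torch

torch.manual_seed(0)
alpha = 0.1  # assumed diffusivity

# Small fully connected network mapping (x, t) -> p
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x, t):
    """Residual p_t - alpha * p_xx at collocation points (x, t), via autograd."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    p = net(torch.cat([x, t], dim=1))
    p_t = torch.autograd.grad(p, t, torch.ones_like(p), create_graph=True)[0]
    p_x = torch.autograd.grad(p, x, torch.ones_like(p), create_graph=True)[0]
    p_xx = torch.autograd.grad(p_x, x, torch.ones_like(p_x), create_graph=True)[0]
    return p_t - alpha * p_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    # Random interior, boundary, and initial-condition sample points
    x_c, t_c = torch.rand(256, 1), torch.rand(256, 1)
    x_b = torch.randint(0, 2, (64, 1)).float()   # x = 0 or x = 1
    t_b = torch.rand(64, 1)
    x_i = torch.rand(64, 1)                      # points at t = 0

    loss_pde = pde_residual(x_c, t_c).pow(2).mean()
    loss_bc = net(torch.cat([x_b, t_b], dim=1)).pow(2).mean()           # p = 0 on boundaries
    p_i = net(torch.cat([x_i, torch.zeros_like(x_i)], dim=1))
    loss_ic = (p_i - torch.sin(math.pi * x_i)).pow(2).mean()            # p(x, 0) = sin(pi x)

    loss = loss_pde + loss_bc + loss_ic
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this toy setting the trained network can be compared against the known analytical solution exp(-alpha * pi^2 * t) * sin(pi * x); reservoir-scale applications replace this scalar equation with coupled multiphase flow equations and heterogeneous property fields.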