Abstract. Large, sparse, Hermitian eigenvalue problems remain among the most computationally challenging tasks in scientific computing. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has emerged as a clear winner. In this research we approach the eigenproblem from a nonlinear perspective, which helps us develop two nearly optimal methods. The first extends the recent Jacobi-Davidson Conjugate Gradient (JDCG) method to JDQMR, improving both robustness and efficiency. The second method, Generalized Davidson+1 (GD+1), uses the locally optimal Conjugate Gradient recurrence as a restarting technique to achieve nearly optimal convergence. We describe both methods within a unifying framework and provide theoretical justification for their near optimality. A choice between the more efficient of the two can be made at runtime. Our extensive experiments confirm the robustness, near optimality, and efficiency of our multimethod approach over other state-of-the-art methods.

1. Introduction. The numerical solution of large, sparse, Hermitian and real symmetric eigenvalue problems is central to many applications in science and engineering. It is also one of the most time consuming tasks. Recently, electronic structure calculations, with eigenproblems at their core, have displaced Quantum Chromodynamics as the top consumer of supercomputer cycles. The symmetric eigenvalue problem seems deceptively simple to solve, given well conditioned eigenvalues and a wealth of theoretical knowledge. Indeed, these advantages have enabled researchers to push modeling accuracy to unprecedented levels, routinely solving for a few extreme eigenvalues of matrices of dimension more than a million, while an order of a billion has also been attempted [48].

The sheer size of these problems can only be addressed by iterative methods. At the same time, this size limits the memory available to these methods.
Moreover, preconditioning becomes imperative to reduce the total number of iterations. While many iterative eigenvalue methods have been proposed, there is little consensus on which method is best and in which situations. The unrestarted Lanczos method is known to be optimal for solving Hermitian eigenvalue problems but, unlike the Conjugate Gradient (CG) method for linear systems, it requires unlimited storage of its iteration vectors. With preconditioning, or under limited storage, the question of optimality remains open. Furthermore, there is a noticeable scarcity of high quality, general purpose software for preconditioned eigensolvers. In this research we seek an optimal, or nearly optimal, method that can utilize preconditioning and that can be implemented in robust yet flexible software.

In the particular case of seeking one eigenpair, if the eigenvalue were known, solving a linear system using CG to obtain the corresponding eigenvector would yield the optimal Lanczos convergence. In practice both the eigenvalue and the eigenvector are unknown, so the appropriate w...
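The connection between the eigenproblem and a shifted linear system can be illustrated with a minimal numerical sketch (this is classical inverse iteration, not the methods developed in this paper): given an approximate eigenvalue sigma, each step solves the near-singular system (A - sigma I) x = v with plain CG. The test matrix, the shift, and all names below are illustrative assumptions.

```python
import numpy as np

def cg(A, b, tol=1e-10, maxit=500):
    """Plain Conjugate Gradient for a symmetric positive definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Small symmetric test matrix with known spectrum (eigenvalues 1..100).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T

# Hypothetical approximate eigenvalue, playing the role of the "known" value.
# sigma below the smallest eigenvalue keeps A - sigma*I positive definite,
# so CG is applicable to the shifted system.
sigma = 0.9

# Inverse iteration: each near-singular shifted solve amplifies the
# eigenvector component nearest sigma.
v = rng.standard_normal(50)
v /= np.linalg.norm(v)
for _ in range(8):
    v = cg(A - sigma * np.eye(50), v)
    v /= np.linalg.norm(v)

rayleigh = v @ A @ v  # Rayleigh quotient: approaches the eigenvalue nearest sigma
```

The per-step error reduction is governed by the gap ratio |lambda_1 - sigma| / |lambda_2 - sigma|, which is why a good shift (or, in practice, a correction-equation solve as in Jacobi-Davidson) is so effective.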