We consider the problem of testing multiple quantum hypotheses $\{\rho_1^{\otimes n}, \ldots, \rho_r^{\otimes n}\}$, where an arbitrary prior distribution is given and each of the $r$ hypotheses is $n$ copies of a quantum state. It is known that the minimal average error probability $P_e$ decays exponentially to zero, that is, $P_e = \exp\{-\xi n + o(n)\}$. However, this error exponent $\xi$ is generally unknown, except for the case $r = 2$. In this paper, we solve the long-standing open problem of identifying the above error exponent, by proving Nussbaum and Szkoła's conjecture that $\xi = \min_{i \neq j} C(\rho_i, \rho_j)$. The right-hand side of this equality is called the multiple quantum Chernoff distance, and $C(\rho_i, \rho_j) := \max_{0 \le s \le 1} \{-\log \operatorname{Tr} \rho_i^s \rho_j^{1-s}\}$ has previously been identified as the optimal error exponent for testing two hypotheses, $\rho_i^{\otimes n}$ versus $\rho_j^{\otimes n}$. The main ingredient of our proof is a new upper bound on the average error probability for testing an ensemble of finite-dimensional, but otherwise general, quantum states. This upper bound, up to a state-dependent factor, matches the multiple-state generalization of Nussbaum and Szkoła's lower bound. Specialized to the case $r = 2$, we give an alternative proof of the achievability of the binary-hypothesis Chernoff distance, which was originally proved by Audenaert et al.
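To make the exponent concrete, the following is a minimal numerical sketch (not part of the paper) that approximates $C(\rho_i, \rho_j)$ by a grid search over $s \in [0,1]$ and then takes the minimum over distinct pairs to obtain the multiple quantum Chernoff distance. The function names `chernoff_distance` and `multiple_chernoff_distance`, the grid resolution, and the assumption of full-rank density matrices are illustrative choices, not the paper's.

```python
import numpy as np

def _mat_power(rho, s):
    """rho**s for a Hermitian positive semidefinite matrix, via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)          # guard against tiny negative eigenvalues
    return (v * w**s) @ v.conj().T

def chernoff_distance(rho, sigma, num_s=1001):
    """Approximate C(rho, sigma) = max_{0<=s<=1} -log Tr[rho^s sigma^(1-s)]
    by evaluating the objective on a uniform grid of s values."""
    ss = np.linspace(0.0, 1.0, num_s)
    vals = [-np.log(np.trace(_mat_power(rho, s) @ _mat_power(sigma, 1.0 - s)).real)
            for s in ss]
    return max(vals)

def multiple_chernoff_distance(states):
    """min_{i != j} C(rho_i, rho_j): the exponent identified in the paper."""
    return min(chernoff_distance(states[i], states[j])
               for i in range(len(states))
               for j in range(len(states)) if i != j)

if __name__ == "__main__":
    # Three full-rank qubit states (illustrative only).
    rho1 = np.array([[0.9, 0.0], [0.0, 0.1]])
    rho2 = np.array([[0.5, 0.4], [0.4, 0.5]])   # eigenvalues 0.9 and 0.1 in the X basis
    rho3 = np.array([[0.2, 0.0], [0.0, 0.8]])
    print(multiple_chernoff_distance([rho1, rho2, rho3]))
```

The grid search suffices here because $s \mapsto -\log \operatorname{Tr}\rho^s\sigma^{1-s}$ is concave on $[0,1]$, so a reasonably fine grid locates the maximum to good accuracy.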