In this paper, we present a general framework for solving a fundamental problem in Random Matrix Theory (RMT): describing the joint distribution of the eigenvalues of the sum A + B of two independent random Hermitian matrices A and B. Questions about mixtures of quantum states are essentially subsumed by this mathematical problem. Rather than treating it in full generality, we focus on deriving the spectral density of a mixture of adjoint orbits of quantum states in terms of the Duistermaat-Heckman measure, which originates in symplectic geometry. Using this method, we obtain the spectral density of a mixture of independent random states; in particular, we derive explicit formulas for mixtures of random qubits. We also find that, in a two-level quantum system, the average entropy of the equiprobable mixture of n random density matrices drawn from a random state ensemble (specified in the text) increases with n. Hence, as a physical application, our results quantitatively explain why the quantum coherence of the mixture statistically decreases monotonically as the number n of components in the mixture grows. Moreover, our method may be used to investigate statistical properties of a special subclass of unital qubit channels.

Mathematics Subject Classification. 22E70, 81Q10, 46L30, 15A90, 81R05
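The claim that the average entropy of an equiprobable mixture of n random qubits grows with n can be checked numerically. The following is a minimal Monte Carlo sketch, not the paper's method: it assumes the random states are drawn from the Hilbert-Schmidt ensemble (normalized G G† with G a complex Ginibre matrix), mixes n of them with equal weights, and averages the von Neumann entropy over trials. As n increases, the mixture concentrates near the maximally mixed state I/2, so the average entropy rises toward log 2.

```python
import numpy as np

def random_qubit_density(rng):
    # Hilbert-Schmidt-distributed qubit state: rho = G G† / Tr(G G†),
    # with G a 2x2 complex Ginibre matrix (assumed ensemble for this sketch).
    g = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    m = g @ g.conj().T
    return m / np.trace(m).real

def von_neumann_entropy(rho):
    # S(rho) = -sum_i lambda_i log lambda_i over nonzero eigenvalues.
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log(lam)))

def avg_mixture_entropy(n, trials=2000, seed=0):
    # Average entropy of the equiprobable mixture (rho_1 + ... + rho_n) / n.
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(trials):
        rho = sum(random_qubit_density(rng) for _ in range(n)) / n
        vals.append(von_neumann_entropy(rho))
    return float(np.mean(vals))

entropies = [avg_mixture_entropy(n) for n in (1, 2, 4, 8)]
print(entropies)  # monotonically increasing, bounded above by log 2
```

The monotone growth of the printed values illustrates, for this particular ensemble, the statistical decrease of coherence with the number of mixture components discussed above.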