Estimating the number of signals in the presence of noise is an important problem in several areas of statistical signal processing. Many recent works address the design of solutions that are optimal with respect to various criteria, each of which generates a model order selection (MOS) algorithm. However, the minimum error probability criterion has received little attention, even though errors in estimating the number of signals can directly degrade the performance of the signal processing system as a whole. In this paper, we propose a new approach to the design of MOS algorithms that is partially based on the minimum error probability criterion. We also devote considerable attention to the performance and consistency analysis of MOS algorithms, using an abridged error probability as a universal performance measure. We propose a theoretical framework that yields closed-form expressions for the abridged error probabilities of a wide range of MOS algorithms, and we provide a parametric consistency analysis of the presented algorithms. These results are then used for parametric optimization of the presented MOS algorithms. Finally, we study a quasilikelihood (QL) approach to the design and analysis of MOS algorithms: the proposed framework is used to obtain the abridged error probabilities as functions of the unknown signal parameter, and these functions, in turn, allow us to determine the scope of applicability of the QL approach.
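To make the MOS setting concrete, the sketch below shows one classic eigenvalue-based estimator of the number of signals, the MDL criterion of Wax and Kailath (1985). This is a standard baseline for the problem described above, not the algorithm proposed in this paper; the function name, array shapes, and simulation parameters are illustrative assumptions.

```python
import numpy as np

def estimate_num_signals_mdl(X):
    """Estimate the number of signals from snapshot matrix X (sensors x snapshots)
    via the classic MDL criterion applied to sample-covariance eigenvalues.
    This is a standard baseline MOS algorithm, not the method of this paper."""
    p, N = X.shape
    R = X @ X.conj().T / N                       # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, descending
    mdl = np.empty(p)
    for k in range(p):
        tail = lam[k:]                           # presumed noise eigenvalues
        # log of (geometric mean / arithmetic mean) of the noise eigenvalues;
        # close to zero when the tail is flat, strongly negative otherwise
        log_ratio = np.mean(np.log(tail)) - np.log(np.mean(tail))
        mdl[k] = -N * (p - k) * log_ratio + 0.5 * k * (2 * p - k) * np.log(N)
    return int(np.argmin(mdl))                   # model order minimizing MDL

# Example (hypothetical scenario): 2 signals mixed into 6 sensors, 1000 snapshots
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))                  # assumed mixing matrix
X = A @ rng.standard_normal((2, 1000)) + 0.1 * rng.standard_normal((6, 1000))
k_hat = estimate_num_signals_mdl(X)              # recovers 2 at this high SNR
```

At high SNR the eigenvalue gap between signal and noise subspaces is large, so the data term of the MDL criterion dominates up to the true order, after which the penalty term takes over.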