The multicategory support vector machine (MSVM) has been widely used for multicategory classification. Despite its popularity, the regular MSVM cannot provide direct probabilistic outputs and suffers from excessive computational cost, because it is built on the hinge loss function and requires solving a quadratic programming problem under a sum-to-zero constraint. In this study, we propose a general refinement of the regular MSVM, termed the simplex-based proximal MSVM (SPMSVM). The SPMSVM uses a novel family of squared error loss functions in place of the hinge loss and removes the explicit sum-to-zero constraint through the simplex structure. Consequently, the SPMSVM only requires solving an unconstrained linear system, which admits closed-form solutions. In addition, the SPMSVM can be cast as a weighted regression problem, making it scalable to large-scale applications. Moreover, the SPMSVM naturally yields an estimate of the conditional category probability, making it more informative than the regular MSVM. Theoretically, the SPMSVM is shown to include many existing MSVMs as special cases, and its asymptotic and finite-sample statistical properties are established. Simulations and real-data examples show that the proposed SPMSVM is a stable, scalable and competitive classifier.
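To make the simplex idea concrete, the following is a minimal illustrative sketch (not the paper's actual SPMSVM loss family or estimator): categories are coded as the vertices of a regular simplex in R^(K-1), which encodes the sum-to-zero structure implicitly, and a plain squared error fit of the coded labels then reduces to an unconstrained ridge-type linear system with a closed-form solution. All function names and the regularization parameter `lam` here are hypothetical choices for the demonstration.

```python
import numpy as np

def simplex_vertices(K):
    """K unit-norm vertices of a regular simplex in R^(K-1); they sum to zero,
    so the sum-to-zero constraint is built into the coding rather than imposed."""
    V = np.zeros((K, K - 1))
    V[0] = 1.0 / np.sqrt(K - 1)
    for j in range(1, K):
        V[j] = -(1.0 + np.sqrt(K)) / (K - 1) ** 1.5
        V[j, j - 1] += np.sqrt(K / (K - 1))
    return V

def fit_squared_loss_simplex(X, y, K, lam=1e-2):
    """Illustrative squared-error fit of simplex-coded labels: an unconstrained
    ridge-type linear system solved in closed form (not the SPMSVM loss itself)."""
    V = simplex_vertices(K)
    Y = V[y]                                      # (n, K-1) coded targets
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append intercept column
    d = Xb.shape[1]
    B = np.linalg.solve(Xb.T @ Xb + lam * np.eye(d), Xb.T @ Y)
    return B, V

def predict(X, B, V):
    """Classify by the nearest simplex vertex; with unit-norm vertices this is
    equivalent to the largest inner product with the decision vector."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    F = Xb @ B                                    # (n, K-1) decision vectors
    return np.argmax(F @ V.T, axis=1)
```

Because every step is a linear-algebra solve rather than a constrained quadratic program, the same computation scales to large samples via standard weighted least-squares machinery, which is the computational point the abstract makes.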