Critical questions in neuroscience and machine learning can be addressed by establishing strong stability, robustness, entrainment, and computational efficiency properties of neural network models. The usefulness of such strong properties motivates the development of a comprehensive contractivity theory for neural networks. This paper makes two sets of contributions. First, we develop novel general results on non-Euclidean matrix measures and nonsmooth contraction theory. Regarding ℓ1/ℓ∞ matrix measures, we show their quasiconvexity with respect to positive diagonal weights, their monotonicity with respect to principal submatrices, and provide closed-form expressions for certain matrix polytopes. These results motivate the introduction of M-Hurwitz matrices, i.e., matrices whose Metzler majorant is Hurwitz. Regarding nonsmooth contraction theory, we show that the one-sided Lipschitz constant of a Lipschitz vector field is equal to the essential supremum of the matrix measure of its Jacobian. Second, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing rate, Persidskii, Lur'e, and other models. For each model, we compute the optimal contraction rate and weighted non-Euclidean norm via a linear program or, in some special cases, via an M-Hurwitz condition on the synaptic matrix. Our analysis also establishes absolute contraction and total contraction.
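To make the central objects concrete, the following is a minimal numerical sketch, assuming the standard definitions of the Metzler majorant ⌈A⌉ (diagonal entries kept, off-diagonal entries replaced by their absolute values) and of the weighted ℓ∞ matrix measure μ_{∞,[η]⁻¹}(A) = max_i ( a_ii + Σ_{j≠i} |a_ij| η_j/η_i ). The helper names are hypothetical, and the eigenvector-based choice of weights is offered only as an illustrative sanity check of the optimal-rate claim, not as the paper's linear-programming procedure.

```python
import numpy as np

def metzler_majorant(A):
    """Metzler majorant of A: keep diagonal, take |.| of off-diagonal entries."""
    M = np.abs(A)
    np.fill_diagonal(M, np.diag(A))
    return M

def is_M_hurwitz(A, tol=1e-9):
    """A is M-Hurwitz if its Metzler majorant is Hurwitz (spectral abscissa < 0)."""
    return np.max(np.real(np.linalg.eigvals(metzler_majorant(A)))) < -tol

def mu_inf_weighted(A, eta):
    """Weighted ℓ∞ matrix measure: max_i ( a_ii + sum_{j != i} |a_ij| * eta_j / eta_i )."""
    n = A.shape[0]
    return max(
        A[i, i] + sum(abs(A[i, j]) * eta[j] for j in range(n) if j != i) / eta[i]
        for i in range(n)
    )

# Example synaptic-type matrix (hypothetical values).
A = np.array([[-2.0,  0.5, -0.3],
              [ 0.4, -3.0,  0.7],
              [-0.2,  0.6, -1.5]])

print(is_M_hurwitz(A))                      # True: the Metzler majorant is Hurwitz
print(mu_inf_weighted(A, np.ones(3)))       # unweighted ℓ∞ measure (already negative here)

# Sanity check (assumed fact for an irreducible Metzler majorant): its right
# Perron eigenvector gives weights for which the weighted ℓ∞ measure equals
# the spectral abscissa of the majorant, i.e., the best achievable rate.
M = metzler_majorant(A)
vals, vecs = np.linalg.eig(M)
k = np.argmax(np.real(vals))
eta_opt = np.abs(np.real(vecs[:, k]))
print(mu_inf_weighted(A, eta_opt), np.real(vals[k]))  # should (approximately) agree
```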