This paper investigates stability conditions for continuous-time Hopfield and firing-rate neural networks by leveraging contraction theory. First, we present a number of useful general algebraic results on matrix polytopes and products of symmetric matrices. Then, we give sufficient conditions for strong and weak Euclidean contractivity, i.e., contractivity with respect to the ℓ2 norm, of both models with symmetric weights and (possibly) non-smooth activation functions. Our contraction analysis leads to contraction rates that are log-optimal for almost all symmetric synaptic matrices. Finally, we use our results to propose a firing-rate neural network model that solves a quadratic optimization problem with box constraints.
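As an illustration of the kind of model mentioned in the last sentence, the following is a minimal numerical sketch, not the paper's exact construction: it assumes a firing-rate network of the projected-gradient form $\dot{x} = -x + \Phi\bigl(x - \gamma(Qx + c)\bigr)$, where the activation $\Phi$ is a saturation (clipping onto the box), and the data $Q$, $c$, and the box bounds are hypothetical placeholders. Equilibria of these dynamics coincide with fixed points of projected gradient descent, i.e., with the minimizer of the box-constrained quadratic program when $Q$ is positive definite.

```python
import numpy as np

# Hypothetical box-constrained QP:  min_x  0.5 x^T Q x + c^T x   s.t.  l <= x <= u
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
Q = A @ A.T + np.eye(3)          # symmetric positive definite (assumed data)
c = rng.standard_normal(3)
l, u = -np.ones(3), np.ones(3)   # box constraints (assumed data)

gamma = 1.0 / np.linalg.norm(Q, 2)   # gain below 2 / lambda_max(Q) ensures contraction

def clip(z):
    """Saturation activation: Euclidean projection onto the box [l, u]."""
    return np.minimum(np.maximum(z, l), u)

def vector_field(x):
    """Firing-rate style dynamics  x' = -x + clip(x - gamma*(Qx + c))."""
    return -x + clip(x - gamma * (Q @ x + c))

# Forward-Euler simulation of the continuous-time network
x = np.zeros(3)
dt = 0.05
for _ in range(2000):
    x = x + dt * vector_field(x)

print("network equilibrium:", x)
# Residual of the projected-gradient fixed-point condition (zero at the QP minimizer)
print("fixed-point residual:", np.linalg.norm(vector_field(x)))
```

Under these assumptions the map $x \mapsto \Phi(x - \gamma(Qx + c))$ is a Euclidean contraction, so the simulated trajectory converges to the unique equilibrium, which solves the sketched QP.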