The paper addresses convergence of solutions for a class of differential inclusions termed differential variational inequalities (DVIs). Each DVI describes the dynamics of a neural network (NN) evolving in a closed hypercube of $R^n$ and defined by a continuously differentiable, {\em cooperative\/} and (possibly) nonreciprocal vector field $f$. The main result of the paper is that under a new condition on $f$, called the strong Kamke--M\"uller (SKM) condition, the solution semiflow generated by the DVI is strongly order preserving (SOP); hence it satisfies a {\sc Limit Set Dichotomy} and enjoys generic convergence properties. The SKM condition is characterized in terms of the interconnection properties of the Jacobian matrix of $f$. When $f$ is an affine or linear vector field, the DVIs considered include two relevant classes of NNs: linear systems operating on a closed hypercube, also known as linear systems in saturated mode (LSSMs), and the full-range (FR) model of cellular neural networks (CNNs). Applying the results to LSSMs shows that any cooperative LSSM with a (possibly) nonsymmetric and fully interconnected matrix is generically convergent. Analogous results hold for FRCNNs. All the convergence results hold in the general case where the DVIs, LSSMs, and FRCNNs possess multiple equilibrium points.
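For an affine vector field $f(x) = Ax + b$, cooperativity reduces to nonnegativity of the off-diagonal entries of $A$, and full interconnection of the network corresponds to irreducibility of $A$ (strong connectivity of the associated interconnection graph). A minimal sketch of these two checks, under standard definitions (the function names are illustrative, not from the paper):

```python
def is_cooperative(A):
    """An affine field f(x) = A x + b is cooperative when every
    off-diagonal entry of the matrix A is nonnegative."""
    n = len(A)
    return all(A[i][j] >= 0 for i in range(n) for j in range(n) if i != j)

def is_irreducible(A):
    """A is irreducible (fully interconnected network) when the directed
    graph with an edge i -> j for every nonzero A[i][j] is strongly
    connected, i.e. every node reaches every other node."""
    n = len(A)

    def reachable(s):
        # Depth-first search over nonzero entries of row i.
        seen, stack = {s}, [s]
        while stack:
            i = stack.pop()
            for j in range(n):
                if A[i][j] != 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    return all(len(reachable(s)) == n for s in range(n))

# Example: a cooperative, irreducible (fully interconnected) 3x3 matrix;
# diagonal entries may be negative without affecting cooperativity.
A = [[-2.0, 0.5, 0.1],
     [ 0.3, -1.0, 0.2],
     [ 0.4, 0.6, -3.0]]
```

A matrix failing either check, e.g. one with a negative off-diagonal entry, falls outside the cooperative class to which the paper's generic-convergence result applies.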