This article investigates exponential passivity and H∞ performance for neural networks (NNs) subject to leakage and distributed delays. A novel criterion guaranteeing exponential passivity of such networks is derived, and new criteria for exponential stability and H∞ performance are established. Based on Lyapunov-Krasovskii stability theory, an integral inequality is employed to estimate the derivatives of newly constructed Lyapunov-Krasovskii functionals (LKFs), which incorporate triple and quadruple integral terms. The resulting conditions depend on the leakage delay and on the upper bound of the time-varying delays. Comparisons with existing results are provided to place the criteria in context, and several numerical examples with computer simulations demonstrate the effectiveness of the findings.
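To illustrate the structure referred to above, triple- and quadruple-integral LKF terms in delay-dependent analysis are typically of the following form; this is a generic sketch from the delay-systems literature, not the paper's exact functional, and the symbols $R_1, R_2 \succ 0$ (weighting matrices) and $h > 0$ (delay bound) are assumptions introduced here for illustration:

```latex
% Generic triple- and quadruple-integral LKF terms (illustrative only):
% x(t) is the network state, R_1, R_2 > 0 are weighting matrices,
% and h > 0 is an upper bound on the delay.
V_3(t) = \int_{-h}^{0}\!\int_{\theta}^{0}\!\int_{t+\lambda}^{t}
         \dot{x}^{T}(s)\, R_1\, \dot{x}(s)\, \mathrm{d}s\, \mathrm{d}\lambda\, \mathrm{d}\theta,
\qquad
V_4(t) = \int_{-h}^{0}\!\int_{\theta}^{0}\!\int_{\lambda}^{0}\!\int_{t+\mu}^{t}
         \dot{x}^{T}(s)\, R_2\, \dot{x}(s)\, \mathrm{d}s\, \mathrm{d}\mu\, \mathrm{d}\lambda\, \mathrm{d}\theta.
```

Differentiating such terms along system trajectories produces the nested integrals of $\dot{x}^{T}(s) R_i \dot{x}(s)$ that integral inequalities (e.g., Jensen-type bounds) are then used to estimate.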