We propose a new analytic method for comparing constant-gain adaptive signal processing algorithms. Specifically, estimates of the convergence speed of the algorithms allow for the definition of a local performance measure, called the efficacy, which can be evaluated theoretically. By definition, the efficacy is consistent with the fair-comparison techniques currently used in signal processing applications. Using the efficacy as a performance measure, we prove that the LMS-Newton algorithm is optimum and, thus, the fastest algorithm within a very rich algorithmic class. Furthermore, we prove that, for an important class of input signals, regular LMS outperforms all of its variants that apply the same nonlinear transformation to each element of the regression vector (such as the signed-regressor and quantized-regressor algorithms). Simulations support all of our theoretical conclusions. The update recursions of the algorithms named above are sketched after this paragraph.
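For reference, the algorithms compared above admit the following standard constant-gain update recursions; the notation here is illustrative rather than taken from the paper ($W_n$ denotes the adaptive weight vector, $X_n$ the regression vector, $d_n$ the desired response, $\mu$ the constant gain, and $\mathrm{sgn}(\cdot)$ acts elementwise), and the paper's own definitions may differ in detail:

\begin{align*}
e_n &= d_n - X_n^{T} W_{n-1} && \text{(a priori estimation error)}\\
\text{LMS:}\quad W_n &= W_{n-1} + \mu\, e_n\, X_n\\
\text{Signed regressor:}\quad W_n &= W_{n-1} + \mu\, e_n\, \mathrm{sgn}(X_n)\\
\text{LMS-Newton:}\quad W_n &= W_{n-1} + \mu\, e_n\, R^{-1} X_n, && R = E\!\left[X_n X_n^{T}\right]
\end{align*}

The signed-regressor recursion is one instance of the variant class discussed above: a fixed nonlinear transformation (here, the sign function) applied to each element of the regression vector before the update.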