The radial basis function (RBF) neural network is a universal approximator that has been widely used in many fields. Improving the training speed and compactness of RBF networks is critical for promoting their applications. In the present study, we propose a simple, fast, and effective RBF network training method based on the residual extreme points and their neighborhoods (hence called the REN method in this paper). The REN method calculates RBF centers and widths through a two-level iterative process and realizes two main functionalities, namely 1) adding multiple centers within one pass through the whole data set, and 2) calculating an RBF width specifically for each center. The algorithm requires no parameter tuning, and models for approximation or classification can be obtained in a single run. The performance of the proposed REN algorithm is compared with that of the classic and powerful orthogonal least squares (OLS) algorithm. At the same accuracies, the REN algorithm trains RBF networks 50 and 320 times faster than the OLS algorithm on the chirp (0–50 Hz, 2 s, 1 kHz, 2001 samples) and two-dimensional peaks (2401 samples) signal approximation tasks, respectively, while using half as many centers. With the same number of centers, the REN algorithm achieves accuracies up to three orders of magnitude higher than the best results obtained by the OLS algorithm. In the classification task on a real discrete breast cancer data set, both methods reach accuracies comparable to many existing methods, but the REN algorithm has the advantages of fast training and no need for parameter adjustment. The REN algorithm proposed in this study may therefore be suitable for tasks with large-scale data or for applications that require high model performance.
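As background for the setting above, the following sketch shows a generic Gaussian RBF network: once centers and widths are fixed (the part that REN and OLS select differently; the center-selection logic here is a hand-chosen placeholder, not the paper's method), the output weights reduce to a linear least-squares fit.

```python
# Generic Gaussian RBF network sketch (illustrative background only,
# NOT the REN or OLS center-selection algorithm from the paper).
import numpy as np

def rbf_design_matrix(x, centers, widths):
    """Gaussian activations phi_j(x) = exp(-(x - c_j)^2 / (2 s_j^2))."""
    d = x[:, None] - centers[None, :]          # pairwise sample/center differences
    return np.exp(-d**2 / (2.0 * widths[None, :]**2))

def fit_rbf_weights(x, y, centers, widths):
    """Solve min_w ||Phi w - y||^2 for the output-layer weights."""
    phi = rbf_design_matrix(x, centers, widths)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w

def rbf_predict(x, centers, widths, w):
    return rbf_design_matrix(x, centers, widths) @ w

# Approximate a 1-D target with hand-chosen, evenly spaced centers.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)
centers = np.linspace(0.0, 1.0, 10)            # fixed centers (placeholder choice)
widths = np.full(10, 0.1)                      # one width per center
w = fit_rbf_weights(x, y, centers, widths)
y_hat = rbf_predict(x, centers, widths, w)
print(np.max(np.abs(y_hat - y)))               # small residual on this smooth target
```

The compactness and speed questions the paper addresses concern how `centers` and `widths` are chosen: OLS selects centers one at a time with a shared width, whereas REN adds multiple centers per pass and assigns each its own width.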