Abstract

Many estimation, prediction, and learning applications have a dynamic nature. One of the most important challenges in machine learning is dealing with concept changes: underlying changes may make a model designed on old data inconsistent with new data. Moreover, algorithms usually specialize in one type of change. Another challenge is reusing previously acquired information in scenarios where changes may recur; this strategy improves learning accuracy and reduces processing time. Unfortunately, most existing learning algorithms that deal with changes adapt on a batch basis. This process usually takes a long time, and the batch data may not reflect the current state of the system. Furthermore, even when the system is adapted on a sample basis, existing algorithms may adapt slowly to changes and cannot reconcile old and new information. This paper proposes an On-line Weighted Ensemble (OWE) of regressor models that learns incrementally, sample by sample, in the presence of several types of changes, while simultaneously retaining old information in recurring scenarios. The key idea is to keep a moving window that slides whenever a new sample becomes available. The error of each model on the current window is determined using a boosting strategy that assigns small errors to models that accurately predict the samples predicted poorly by the ensemble. To handle recurring and non-recurring changes, OWE uses a new assignment of model weights that takes into account each model's errors on both past and current windows, using a discounting factor that decreases or increases the contribution of old windows. In addition, OWE launches new models when the system's accuracy is decreasing, and it can exclude inaccurate models over time. Experiments with artificial and industrial data reveal that in most cases OWE outperforms other state-of-the-art concept drift approaches.
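To make the weighting scheme concrete, the following is a minimal Python sketch of the sliding-window, discounted error weighting summarized above. All names and parameters (OWESketch, EwmaModel, window_size, discount) are illustrative assumptions, not the authors' implementation; plain squared error on the window stands in for the paper's boosting-weighted error, and the model-launching and pruning steps are omitted.

    import numpy as np

    class EwmaModel:
        """Placeholder base regressor: exponentially weighted mean of targets."""
        def __init__(self, alpha):
            self.alpha = alpha   # adaptation rate of this base learner
            self.mean = 0.0
        def predict(self, x):
            return self.mean
        def update(self, x, y):
            self.mean += self.alpha * (y - self.mean)

    class OWESketch:
        """Sliding-window ensemble with discounted per-model window errors
        (hypothetical sketch; simplified relative to the paper's OWE)."""
        def __init__(self, models, window_size=50, discount=0.9):
            self.models = models
            self.window = []                       # most recent (x, y) pairs
            self.window_size = window_size
            self.discount = discount               # contribution of old windows
            self.hist_err = np.zeros(len(models))  # discounted accumulated errors

        def weights(self):
            inv = 1.0 / (self.hist_err + 1e-12)    # small error -> large weight
            return inv / inv.sum()

        def predict(self, x):
            preds = np.array([m.predict(x) for m in self.models])
            return float(self.weights() @ preds)

        def learn_one(self, x, y):
            self.window.append((x, y))             # slide the window
            if len(self.window) > self.window_size:
                self.window.pop(0)
            # each model's mean squared error on the current window
            cur = np.array([np.mean([(m.predict(xi) - yi) ** 2
                                     for xi, yi in self.window])
                            for m in self.models])
            # discounting: errors from past windows fade at rate `discount`
            self.hist_err = self.discount * self.hist_err + cur
            for m in self.models:
                m.update(x, y)

    # Demo: an abrupt concept change at t = 100 shifts ensemble weight
    # toward the fast-adapting base model.
    ens = OWESketch([EwmaModel(0.01), EwmaModel(0.5)])
    for t in range(200):
        ens.learn_one(t, 1.0 if t < 100 else 5.0)
    print(ens.weights(), ens.predict(200))

In this sketch, a discount close to 1 preserves old-window errors (favoring models that were accurate before a recurring change), while a small discount lets the weights track only recent behavior.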