Within the scope of Machine Learning (ML), Instance Selection (IS) is a sampling process that filters noise and removes redundant data. In classification problems, IS entails a trade-off between maximizing performance and reducing the size of the training sample. Complexity measures provide relevant information about the difficulty of classifying instances, which makes them well suited for IS, since they can capture, for example, noisy or borderline points. This paper introduces Dynamic Disagreeing Neighbors (DDN), a complexity measure defined at three levels: instance, class, and dataset. The DDN of an instance is the percentage of its nearest neighbors that belong to other classes. DDN builds on the Nearest Centroid Neighbors (NCN) neighborhood, which adjusts dynamically to the data distribution. In addition, each neighbor is weighted by its distance, so that closer neighbors contribute more to the instance's complexity than farther ones. The validity of the proposal is evaluated through a series of experiments comparing it with the widely known k-Disagreeing Neighbors (kDN) measure in terms of stability, correlation with classification error, and performance in IS. DDN has shown competitive, stable, and robust results throughout the experiments, generally improving on those obtained with kDN.
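The idea sketched in the abstract can be illustrated with a minimal Python example. This is not the paper's exact formulation: it assumes Euclidean distance, a greedy NCN construction, and inverse-distance weighting as one possible decreasing weight; the function names (`ncn_neighbors`, `weighted_disagreement`) are hypothetical.

```python
import numpy as np

def ncn_neighbors(X, x, k):
    """Greedy Nearest Centroid Neighbors: at each step, add the point that
    keeps the centroid of the selected set closest to x.
    (Sketch; x is assumed not to be a row of X — when scoring a training
    instance, exclude it from X first.)"""
    remaining = list(range(len(X)))
    selected = []
    for _ in range(k):
        best, best_d = None, np.inf
        for i in remaining:
            centroid = X[selected + [i]].mean(axis=0)
            d = np.linalg.norm(centroid - x)
            if d < best_d:
                best, best_d = i, d
        selected.append(best)
        remaining.remove(best)
    return selected

def weighted_disagreement(X, y, x, label, k=5):
    """Distance-weighted fraction of NCN neighbors whose class differs
    from `label` — closer neighbors contribute more."""
    idx = ncn_neighbors(X, x, k)
    d = np.linalg.norm(X[idx] - x, axis=1)
    w = 1.0 / (d + 1e-12)                  # inverse-distance weights (one possible choice)
    disagree = (y[idx] != label).astype(float)
    return float(np.sum(w * disagree) / np.sum(w))
```

A score near 0 marks an easy instance surrounded by its own class; a score near 1 marks a noisy or borderline point, which is the kind of information an IS method can exploit.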