SMOTE is an oversampling approach previously proposed to solve the imbalanced-data binary classification problem. SMOTE improves classification accuracy, but it needs to generate a large number of synthetic instances, which is inefficient in terms of memory and time. To overcome these drawbacks, Borderline-SMOTE (BSMOTE) was proposed to reduce the number of generated synthetic instances by generating them only along the borderline between the majority and minority classes. Unfortunately, BSMOTE achieves only modest savings in the number of generated instances, and at the cost of classification accuracy. To improve BSMOTE's accuracy, this paper proposes an Affinitive Borderline SMOTE (AB-SMOTE) that builds on BSMOTE and improves the quality of the generated synthetic data by taking into consideration the affinity of the borderline instances. Experimental results show that AB-SMOTE, when compared with BSMOTE, produced the most accurate results in the majority of the test cases adopted in our study.
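For readers unfamiliar with the oversampling step the abstract refers to, the following is a minimal sketch of SMOTE-style synthetic instance generation: each synthetic point is a random interpolation between a minority instance and one of its k nearest minority neighbours. The function and parameter names are our own illustration, not the paper's implementation, and no borderline or affinity filtering is shown.

```python
import random
from math import dist

def smote_sample(minority, k=3, n_new=10, seed=0):
    """Sketch of the core SMOTE idea: interpolate between a minority
    point and one of its k nearest minority neighbours.
    (Illustrative only; names and defaults are assumptions.)"""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbours of x, excluding x itself
        neighbours = sorted((p for p in minority if p != x),
                            key=lambda p: dist(x, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random position along the line segment x -> nb
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
new_points = smote_sample(minority)
print(len(new_points))  # 10 synthetic minority instances
```

Because each synthetic point is a convex combination of two minority instances, it always lies on the segment between them; borderline variants such as BSMOTE restrict which instances may serve as interpolation seeds.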