In machine learning, the Extreme Learning Machine (ELM) is widely used for classification and regression. However, like many traditional learning algorithms, ELM often yields unsatisfactory classification results on imbalanced data. To address this, we propose an extreme learning machine with output weight adjustment, called OWA-ELM, which shifts the decision boundary of ELM toward the majority class and thereby improves classification performance on imbalanced data. Specifically, we add a suitable increment Δ to the connection weights between the hidden-layer neurons and the minority-class output neurons, which increases the outputs of the minority-class output neurons. As a result, the classification accuracy on minority-class samples improves without significantly degrading the classification of majority-class samples. We compared OWA-ELM with ELM, WELM, CS-ELM, and CCR-ELM on 22 data sets: OWA-ELM achieved the best G-mean on 9 data sets and the second-best on 4, and the best F-measure on 13 data sets and the second-best on 1. These results indicate that OWA-ELM is effective for imbalanced data classification.
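To illustrate the idea, the following is a minimal sketch in Python, not the paper's implementation: it trains a basic single-hidden-layer ELM (random input weights, sigmoid activation, output weights via pseudoinverse) and then adds a uniform increment Δ to the output-weight column of the minority class. The function names, the sigmoid activation, and the uniform, hand-picked value of Δ are assumptions; the paper's rule for choosing Δ is not stated in this abstract.

```python
import numpy as np

def elm_train(X, Y, n_hidden=100, seed=None):
    """Basic ELM: random input weights/biases, output weights by pseudoinverse."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y                      # output weights (least-squares)
    return W, b, beta

def adjust_minority_weights(beta, minority_cols, delta):
    """Output-weight-adjustment step (sketch): add Δ to the weights feeding the
    minority output neurons. Since hidden activations are non-negative here,
    this raises the minority outputs and shifts the boundary toward the
    majority class."""
    beta_adj = beta.copy()
    beta_adj[:, minority_cols] += delta               # hypothetical uniform Δ
    return beta_adj

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)

if __name__ == "__main__":
    # Toy imbalanced data: 900 majority samples (class 0), 100 minority (class 1).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (900, 5)), rng.normal(1, 1, (100, 5))])
    y = np.array([0] * 900 + [1] * 100)
    Y = np.eye(2)[y]                                  # one-hot targets
    W, b, beta = elm_train(X, Y, n_hidden=50, seed=1)
    beta_adj = adjust_minority_weights(beta, minority_cols=[1], delta=0.05)
    pred = elm_predict(X, W, b, beta_adj)
    print("minority recall:", (pred[y == 1] == 1).mean())
```

In this sketch, increasing Δ trades majority-class accuracy for minority-class recall; the paper's contribution is choosing a "reasonable" Δ so that the minority class improves without significantly harming the majority class.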