Ultrasound imaging is well suited to large-scale screening for breast tumors, especially for Asian women, whose glandular tissue tends to be dense. However, ultrasound images suffer from low contrast and resolution, blurred boundaries, and artifacts, which make interpretation difficult for junior physicians. Traditional methods of breast ultrasound tumor recognition rely on manually extracted features to locate the region of interest (ROI) and classify the tumor in separate stages, yielding low accuracy, poor robustness, and weak generality. Existing deep learning approaches, meanwhile, are typically limited to either locating the tumor ROI or classifying an already-given ROI. In this paper, the YOLOv3 algorithm is applied to breast ultrasound tumor recognition, performing ROI localization and tumor classification simultaneously. In addition, K-Means is improved with the K-Means++ and K-Medoids algorithms to generate the anchor boxes for YOLOv3, and the Darknet-53 backbone of YOLOv3 is redesigned by combining ResNet and DenseNet connections into ResNet-DenseNet_Darknet-53. The proposed method is evaluated on a breast ultrasound tumor data set. Experiments show that the improved YOLOv3 achieves better detection results on multiple evaluation metrics.
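As context for the anchor-generation step, the following is a minimal sketch of how YOLOv3-style anchor boxes can be clustered from ground-truth box sizes using the standard 1 − IoU distance, with K-Means++ seeding and a medoid-style cluster update standing in for the K-Means++/K-Medoids improvements mentioned above. The exact combination used in the paper is not specified here, so the function names, the choice of k = 9 anchors, and the synthetic box sizes are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between (w, h) pairs, assuming boxes and anchors share a top-left corner."""
    inter_w = np.minimum(boxes[:, None, 0], anchors[None, :, 0])
    inter_h = np.minimum(boxes[:, None, 1], anchors[None, :, 1])
    inter = inter_w * inter_h
    union = (boxes[:, 0] * boxes[:, 1])[:, None] + (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def kmeanspp_init(boxes, k, rng):
    """K-Means++-style seeding with 1 - IoU as the distance."""
    centers = [boxes[rng.integers(len(boxes))]]
    for _ in range(k - 1):
        d_min = (1.0 - iou_wh(boxes, np.array(centers))).min(axis=1)
        probs = d_min ** 2 / (d_min ** 2).sum() + 1e-12
        centers.append(boxes[rng.choice(len(boxes), p=probs / probs.sum())])
    return np.array(centers)

def cluster_anchors(boxes, k=9, iters=100, seed=0):
    """Cluster ground-truth (w, h) pairs into k anchors.

    The medoid-style update (picking the member box that minimises the total
    1 - IoU distance within each cluster) is an assumption standing in for the
    K-Medoids refinement described in the paper.
    """
    rng = np.random.default_rng(seed)
    anchors = kmeanspp_init(boxes, k, rng)
    for _ in range(iters):
        assign = np.argmax(iou_wh(boxes, anchors), axis=1)  # nearest anchor = highest IoU
        new_anchors = anchors.copy()
        for c in range(k):
            members = boxes[assign == c]
            if len(members) == 0:
                continue
            d = 1.0 - iou_wh(members, members)
            new_anchors[c] = members[d.sum(axis=1).argmin()]  # cluster medoid
        if np.allclose(new_anchors, anchors):
            break
        anchors = new_anchors
    return anchors[np.argsort(anchors[:, 0] * anchors[:, 1])]  # sort by area, as YOLOv3 expects

if __name__ == "__main__":
    # Random (w, h) sizes standing in for the breast-ultrasound ground-truth boxes.
    wh = np.abs(np.random.default_rng(1).normal([80, 60], [30, 20], size=(500, 2)))
    print(cluster_anchors(wh, k=9))
```

In practice the (w, h) pairs would come from the labeled tumor bounding boxes, scaled to the YOLOv3 input resolution, and the nine resulting anchors would be split across the three detection scales.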