Breast Ultrasound (BUS) imaging is an essential tool for the early detection of breast cancer. The Breast Imaging Reporting and Data System (BI-RADS) for BUS images helps standardize interpretation and reporting by categorizing breast tumors into multiple classes, enabling radiologists to make more accurate diagnoses and treatment plans. However, most existing classification methods distinguish only between benign and malignant categories. In addition, features extracted by classic convolutional neural networks tend to be insufficient for subdividing BUS images into fine-grained BI-RADS classes, as they typically ignore prior knowledge relevant to medical applications, such as foreground shape. To address these problems, we propose a novel fine-grained BI-RADS classification approach that integrates tumor edges to provide more discriminative features. Firstly, weakly supervised pseudo-label generation: we detect coarse tumor edge regions using a pre-trained PiDiNet and two novel loss functions based on prior knowledge from our dataset. The detected tumor edges are subsequently used as pseudo-labels for the next step. Secondly, co-training of a tumor edge detection network and a BI-RADS classification network: edge images generated by the edge detection network serve as weight masks that highlight tumor edge regions as discriminative parts, improving classification results, especially for categories with high similarity. The proposed method is evaluated on a BUS image dataset of 1061 images annotated with BI-RADS categories. Experimental results indicate that the proposed method improves over the baseline model by 4.73% in top-1 accuracy.

INDEX TERMS Weakly supervised learning, breast ultrasound, deep learning, BI-RADS classification, edge detection.
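The edge-as-weight-mask idea in the second step can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a simple multiplicative weighting in which a predicted edge map amplifies feature responses near tumor boundaries; the function name `edge_weighted_features` and the blending factor `alpha` are illustrative choices.

```python
import numpy as np

def edge_weighted_features(features, edge_map, alpha=0.5):
    """Re-weight CNN feature maps with a tumor-edge mask.

    features: array of shape (C, H, W) -- feature maps from a backbone.
    edge_map: array of shape (H, W) with values in [0, 1] -- predicted
              edge probability map (e.g., from the edge detection network).
    alpha:    strength of the edge emphasis (hypothetical hyperparameter).
    """
    # Regions with strong edge response get a weight up to (1 + alpha);
    # non-edge regions keep their original weight of 1.0.
    mask = 1.0 + alpha * edge_map
    # Broadcast the (H, W) mask across all C channels.
    return features * mask[None, :, :]

# Demo: a 2-channel feature map with an edge along the diagonal.
feats = np.ones((2, 3, 3))
edges = np.eye(3)                     # stand-in for a predicted edge map
out = edge_weighted_features(feats, edges, alpha=0.5)
print(out[0, 0, 0])                   # on-edge location → 1.5
print(out[0, 0, 1])                   # off-edge location → 1.0
```

In this formulation, gradients flowing through the classification loss also reach the edge map, which is one way the two networks could reinforce each other during co-training.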