Real-time detection of fabric defects is a critical part of industrial production. However, several key issues remain unsolved in practical production settings, such as the low detection speed and latency of traditional cloud-based detection. To address these issues, this paper proposes a new detection network architecture, called YOLOV4-TinyS. Firstly, the k-medoids clustering algorithm is used to improve the matching between anchor boxes and ground truths on datasets with large variations. Secondly, the residual structure is modified to reduce the complexity of the network, and depthwise-separable convolutions replace some of the standard convolutions to improve detection speed. Thirdly, the output feature layer is designed with shallow feature fusion to improve the extraction of location information, and spatial and channel attention are applied to improve network efficiency. Finally, the whole network is trained and tested on four different datasets, and extensive experiments show that it achieves higher detection accuracy and faster detection speed than existing methods. Compared to the original YOLOV4-Tiny network, model complexity is reduced by 67.86%, and a peak detection accuracy of 99.91% is achieved. Furthermore, an efficient fabric inspection system is built to validate the method, enabling the fast detection of fabric defects on conveyor belts. Thus, the proposed method lays a foundation for the real-time detection of fabric defects and its application in industry.
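
The anchor-selection step can be illustrated with a minimal k-medoids sketch. This is not the authors' implementation: the function names, the use of (width, height) pairs, and the choice of 1 − IoU as the dissimilarity (common in anchor clustering) are all assumptions for illustration.

```python
import numpy as np

def iou_wh(box, others):
    # IoU between one (w, h) box and an array of (w, h) boxes,
    # assuming all boxes share the same center
    inter = np.minimum(box[0], others[:, 0]) * np.minimum(box[1], others[:, 1])
    union = box[0] * box[1] + others[:, 0] * others[:, 1] - inter
    return inter / union

def kmedoids_anchors(boxes, k=6, iters=50, seed=0):
    """Cluster ground-truth (w, h) boxes with k-medoids under a 1 - IoU
    distance; each resulting anchor is an actual box from the dataset,
    which makes the method robust to outliers (hypothetical sketch)."""
    rng = np.random.default_rng(seed)
    medoid_idx = rng.choice(len(boxes), size=k, replace=False)
    for _ in range(iters):
        medoids = boxes[medoid_idx]
        # assign every box to its nearest medoid
        dist = np.stack([1.0 - iou_wh(b, medoids) for b in boxes])
        labels = dist.argmin(axis=1)
        new_idx = medoid_idx.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members) == 0:
                continue
            # new medoid = member minimizing total distance to its cluster
            sub = np.stack([1.0 - iou_wh(b, boxes[members])
                            for b in boxes[members]])
            new_idx[c] = members[sub.sum(axis=1).argmin()]
        if np.array_equal(new_idx, medoid_idx):
            break  # converged
        medoid_idx = new_idx
    return boxes[medoid_idx]
```

Unlike k-means, every returned anchor is a real ground-truth box, which is one plausible reason k-medoids copes better with datasets whose box sizes differ greatly.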