The health of forest trees affects the ecological environment, so timely detection of Standing Dead Trees (SDTs) plays an important role in forest management. However, because forests cover large spatial extents, it is difficult to locate SDTs through conventional approaches such as field inventories. In recent years, advances in deep learning and Unmanned Aerial Vehicles (UAVs) have provided technical support for low-cost, real-time monitoring of SDTs, but the inability to fully exploit global features and the difficulty of detecting small-scale SDTs pose challenges for SDT detection in visible-light images. Therefore, this paper proposes a multi-scale attention-based detection method for identifying SDTs in UAV RGB images. The method takes Faster R-CNN as the basic framework and uses Swin-Transformer as the backbone network for feature extraction, which effectively captures global information. Features at different scales are then extracted through a feature pyramid structure and a feature balance enhancement module. Finally, dynamic training is used to improve the quality of the model. Experimental results show that the proposed algorithm can effectively identify SDTs in UAV visible-light images with an accuracy of 95.9%. This SDT identification method can not only improve the efficiency of SDT surveys but also help relevant departments investigate other forest species in the future.