Accurate mapping of the spatial distribution of grassland degradation indicator species is of great significance for grassland degradation monitoring. To enable intelligent remote sensing monitoring of grassland degradation, this paper collects unmanned aerial vehicle (UAV) hyperspectral remote sensing data for three degradation indicator species of desert grassland, namely the constructive, dominant, and companion species, and proposes a multi-feature fusion (MFF) classification model. In addition, vertical convolution, horizontal convolution, and group convolution mechanisms are introduced to reduce the number of model parameters and improve computational efficiency. The results show that the model achieves an overall accuracy of 91.81% and a kappa coefficient of 0.8473, outperforming several deep learning classification models in both classification performance and computational efficiency. This study provides a new method for the high-precision, efficient fine classification of degradation indicator species in grasslands.
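The abstract names vertical, horizontal, and group convolution as the parameter-reduction mechanisms but gives no implementation details. The PyTorch sketch below is only an illustration of how such factorized and grouped convolutions are commonly assembled; the channel sizes, patch shape, and the block name `FactorizedGroupConvBlock` are assumptions, not the authors' published architecture.

```python
# Illustrative sketch only: factorized (vertical/horizontal) and grouped
# convolutions as commonly used to cut parameter counts in hyperspectral
# classifiers. Channel sizes and names are assumptions, not the paper's model.
import torch
import torch.nn as nn


class FactorizedGroupConvBlock(nn.Module):
    def __init__(self, in_ch=30, out_ch=64, groups=4):
        super().__init__()
        # Vertical convolution: a k x 1 kernel captures column-wise context.
        self.vertical = nn.Conv2d(in_ch, out_ch, kernel_size=(3, 1), padding=(1, 0))
        # Horizontal convolution: a 1 x k kernel captures row-wise context.
        self.horizontal = nn.Conv2d(out_ch, out_ch, kernel_size=(1, 3), padding=(0, 1))
        # Group convolution: splitting channels into groups divides this
        # layer's parameter count by the number of groups.
        self.grouped = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1, groups=groups)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.vertical(x))
        x = self.act(self.horizontal(x))
        return self.act(self.grouped(x))


if __name__ == "__main__":
    # A hyperspectral patch: batch x bands x height x width (sizes assumed).
    patch = torch.randn(8, 30, 11, 11)
    out = FactorizedGroupConvBlock()(patch)
    print(out.shape)  # torch.Size([8, 64, 11, 11])
```

For a layer with C input and C output channels, a 3 x 1 plus 1 x 3 pair uses roughly two thirds of the weights of a single full 3 x 3 kernel, and grouping further divides a layer's weight count by the number of groups, which is the kind of saving the abstract refers to.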
The proliferation of grassland rodents has aggravated the degradation of desert grassland, and these rodents carry many viruses that threaten human and animal health. Accurate mapping of the spatial distribution of rodent populations, vegetation, and bare soil in grassland is essential for developing rodent control measures. However, traditional survey methods for grassland rodent pest information are time-consuming, costly, and slow. In addition, satellite remote sensing cannot meet the accuracy requirements for identifying grassland rodent holes because of its limited spatial resolution. To realize intelligent grassland rodent infestation monitoring, this paper adopts an unmanned aerial vehicle hyperspectral remote sensing platform for data acquisition and proposes a transformer attention network (TAN) for extracting grassland rodent infestation information. The network adopts a two-stage feature extraction structure that effectively improves the classification performance of the model. In each stage, local features are first extracted with a fixed convolution kernel to enhance detailed texture features; the extracted local features are then refined by a contour convolution module to enrich feature information at the edges of the feature map; finally, a transformer attention module attends to global pixels, suppressing background information and enhancing the output of effective information. The results show that the TAN network achieves an overall accuracy (OA) of 97.71%, an average accuracy of 98.44%, and a kappa coefficient of 0.9538. Compared with the two-dimensional CNN, three-dimensional CNN, HybridSN, and CTN networks, the OA of the TAN network is higher by 2.59%, 2.45%, 2.94%, and 1.03%, respectively. This study improves the efficiency of grassland rodent information surveys and provides a solid theoretical basis for the investigation and statistical analysis of grassland rodent infestation.
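The abstract describes each TAN stage as local convolution, contour-based refinement, then global transformer attention, without implementation details. The sketch below is a minimal illustration of that three-step pipeline under stated assumptions: the fixed Laplacian kernel stands in for the paper's contour convolution module, and the channel sizes, head count, and class names are hypothetical.

```python
# Illustrative sketch only: a two-stage "local conv -> edge refinement ->
# global pixel attention" pipeline in the spirit of the TAN description.
# The Laplacian stand-in, sizes, and names are assumptions, not the paper's code.
import torch
import torch.nn as nn


class EdgeRefine(nn.Module):
    """Stand-in for the contour convolution module: a fixed, per-channel
    Laplacian kernel that emphasizes edges in the feature map."""
    def __init__(self, channels):
        super().__init__()
        lap = torch.tensor([[0., -1., 0.], [-1., 4., -1.], [0., -1., 0.]])
        self.conv = nn.Conv2d(channels, channels, 3, padding=1,
                              groups=channels, bias=False)
        self.conv.weight.data.copy_(lap.expand(channels, 1, 3, 3))
        self.conv.weight.requires_grad_(False)

    def forward(self, x):
        return x + self.conv(x)  # residual edge enhancement


class TANStage(nn.Module):
    def __init__(self, in_ch, out_ch, heads=4):
        super().__init__()
        self.local = nn.Conv2d(in_ch, out_ch, 3, padding=1)   # fixed local kernel
        self.edges = EdgeRefine(out_ch)                        # contour-style refinement
        self.attn = nn.MultiheadAttention(out_ch, heads, batch_first=True)
        self.norm = nn.LayerNorm(out_ch)

    def forward(self, x):
        x = torch.relu(self.local(x))
        x = self.edges(x)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)                  # treat pixels as tokens
        attn_out, _ = self.attn(tokens, tokens, tokens)        # global pixel attention
        tokens = self.norm(tokens + attn_out)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    x = torch.randn(2, 30, 11, 11)                             # assumed patch shape
    net = nn.Sequential(TANStage(30, 64), TANStage(64, 64))    # two-stage structure
    print(net(x).shape)                                        # torch.Size([2, 64, 11, 11])
```

The point of the ordering is that cheap local convolutions supply texture detail, the edge pass sharpens object boundaries such as hole rims, and the attention step lets every pixel weigh the whole scene so that background clutter can be down-weighted.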