The reliable and efficient large-scale mapping of date palm trees from remotely sensed data is crucial for developing palm tree inventories, continuous monitoring, vulnerability assessments, environmental control, and long-term management. Given the increasing availability of UAV images with limited spectral information, the high intra-class variance of date palm trees, the variations in the spatial resolutions of the data, and the differences in image contexts and backgrounds, accurate mapping of date palm trees from very-high spatial resolution (VHSR) images can be challenging. This study aimed to investigate the reliability and the efficiency of various deep vision transformers in extracting date palm trees from multiscale and multisource VHSR images. Several vision transformers, including the Segformer, the Segmenter, the UperNet-Swin transformer, and the dense prediction transformer, with various levels of model complexity, were evaluated. The models were developed and evaluated using a comprehensive set of UAV-based and aerial images. The generalizability and the transferability of the deep vision transformers were evaluated and compared with those of various convolutional neural network (CNN)-based semantic segmentation models (including DeepLabV3+, PSPNet, FCN-ResNet-50, and DANet). The results of the examined deep vision transformers were generally comparable to those of several CNN-based models. The investigated deep vision transformers achieved satisfactory results in mapping date palm trees from the UAV images, with an mIoU ranging from 85% to 86.3% and an mF-score ranging from 91.62% to 92.44%. Among the evaluated models, the Segformer achieved the highest segmentation accuracy on the UAV-based and the multiscale testing datasets. The Segformer model, followed by the UperNet-Swin transformer, outperformed all of the evaluated CNN-based models on the multiscale testing dataset and on an additional unseen UAV testing dataset.
In addition to delivering remarkable results in mapping date palm trees from versatile VHSR images, the Segformer model was among those with a small number of parameters and relatively low computing costs. Collectively, deep vision transformers could be used efficiently in developing and updating inventories of date palms and other tree species.
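The mIoU and mF-score figures reported above are standard semantic segmentation metrics. As a hedged illustration (not the authors' evaluation code), the sketch below shows one common way to compute per-class IoU and F-score from predicted and reference label maps and average them; the function name and the toy arrays are purely illustrative.

```python
import numpy as np

def segmentation_scores(pred, target, num_classes=2):
    """Per-class IoU and F-score from integer label maps, averaged to (mIoU, mF)."""
    ious, fscores = [], []
    for c in range(num_classes):
        p, t = (pred == c), (target == c)
        tp = np.logical_and(p, t).sum()    # pixels correctly labeled as class c
        fp = np.logical_and(p, ~t).sum()   # pixels wrongly labeled as class c
        fn = np.logical_and(~p, t).sum()   # class-c pixels that were missed
        union = tp + fp + fn
        if union == 0:                     # class absent in both maps: skip it
            continue
        ious.append(tp / union)
        fscores.append(2 * tp / (2 * tp + fp + fn))
    return float(np.mean(ious)), float(np.mean(fscores))

# Toy 2 x 3 palm (1) vs. background (0) maps
pred   = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 0]])
miou, mf = segmentation_scores(pred, target)
```

In practice these statistics are accumulated over all test tiles before averaging, rather than per image as in this toy example.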
Landslides pose a potential hazard to urban environments. They occur on certain slopes over time, and assessing where a landslide is likely to occur requires practical slope analysis. Very high-resolution remote sensing data therefore play a significant role in characterizing the slope surface. For this study, 12 landslide conditioning parameters with a 10 × 10 cell size, a set never previously applied together, were created. These factors were derived directly from the LiDAR (light detection and ranging) DEM (digital elevation model) using its layer toolboxes and include slope, aspect, elevation, curvature, and hillshade. The stream power index (SPI), topographic wetness index (TWI), and terrain roughness index (TRI) were created from spatial layers such as slope, flow direction, and flow accumulation. Shapefiles of distances to roads, lakes, trees, and built-up areas were digitized as land use/cover from the LiDAR image and produced using the Euclidean distance method in ArcGIS. The parameters were selected based on expert knowledge, previous landslide literature, and the characteristics of the study area. Moreover, multicriteria decision-making analysis, comprising the analytic hierarchy process (AHP) and fuzzy logic approaches not previously utilized with a LiDAR DEM, was used in this study to predict the possibility of a landslide. Receiver operating characteristic (ROC) curves were used to validate the results. The area under the curve (AUC) values obtained from the ROC method for the AHP and fuzzy logic approaches were 0.859 and 0.802, respectively. The final susceptibility results will be helpful to urban developers in Malaysia and for sustainable landslide hazard mitigation.
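The AHP step described above derives factor weights from a pairwise comparison matrix and checks that the expert judgments are internally consistent. As a minimal sketch (a 3-factor matrix is used here for brevity; the study itself weighted 12 conditioning factors, and these comparison values are invented for illustration), the principal eigenvector gives the weights and the consistency ratio flags unreliable judgments:

```python
import numpy as np

# Illustrative 3-factor pairwise comparison matrix on Saaty's 1-9 scale;
# entry A[i, j] states how much more important factor i is than factor j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized factor weights

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)              # consistency index
ri = 0.58                                    # Saaty's random index for n = 3
cr = ci / ri                                 # consistency ratio; < 0.10 is acceptable
```

The resulting weights are then used to combine the standardized conditioning layers into a susceptibility index, typically as a weighted linear sum in the GIS.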
The Rub’ al Khali desert (or Empty Quarter) is the largest and perhaps most significant sand sea in the world. Located on the southern Arabian Peninsula, the dune field has remained largely unexplored owing to the harsh climate and difficult terrain. This study takes advantage of geospatial techniques (interpolation, supervised classification, and minimum focal statistics) to extract information from global digital elevation models (DEMs) and satellite imagery. The main objectives are to identify and map the different dune forms within the sand sea, estimate the volume of sand, and explore probable sources of the sand. The analysis of dune color strongly suggests that the sand is not completely reworked and intermixed. If this is true, a spatial variability map of the mineral composition of the sand could be very revealing. Although the red sand is quite pronounced, the largest volume of sand (~36%) is associated with the yellow color class. Yellow sand covers most of the western part of the dune field and appears to be a transitional color between the red and white sand in the eastern part. This suggests that the yellow sand might be derived from both local and regional sources, or that it might be less oxidized, reworked, or of a different composition representing a combination of red and white sand.
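One common way to estimate dune sand volume with a minimum focal statistic, as mentioned above, is to approximate the interdune base surface as the focal minimum of the DEM and integrate the elevation difference. The sketch below is a simplified illustration of that idea, not the authors' workflow; the function name, window size, and toy DEM are assumptions.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def sand_volume(dem, cell_size, window=5):
    """Rough dune sand volume from a focal-minimum base surface.

    dem       : 2-D elevation array (m)
    cell_size : pixel edge length (m)
    window    : focal window in cells; should exceed the typical dune spacing
    """
    base = minimum_filter(dem, size=window)    # approximate interdune floor
    thickness = np.clip(dem - base, 0, None)   # sand column above the floor
    return thickness.sum() * cell_size ** 2    # volume in m^3

# Toy example: a single 10 m-high "dune" cell on a flat 100 m plain
dem = np.full((9, 9), 100.0)
dem[4, 4] = 110.0
vol = sand_volume(dem, cell_size=30.0, window=9)
```

Real DEMs would also need void filling and masking of bedrock outcrops before the subtraction, and the window size strongly affects the estimate.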