The geometric feature characterization of fruit trees plays a key role in effective orchard management. LiDAR (light detection and ranging) technology for object detection enables the rapid and precise evaluation of geometric features. This study aimed to quantify the height, canopy volume, tree spacing, and row spacing in an apple orchard using a three-dimensional (3D) LiDAR sensor. A LiDAR sensor was used to collect 3D point cloud data from the apple orchard. Six sample apple trees, representing a variety of shapes and sizes, were selected for data collection and validation. Commercial software and the Python programming language were used to process the collected data. The data processing steps involved data conversion, radius outlier removal, voxel grid downsampling, denoising by filtering out erroneous points, segmentation of the region of interest (ROI), clustering using the density-based spatial clustering of applications with noise (DBSCAN) algorithm, data transformation, and the removal of ground points. Accuracy was assessed by comparing the estimated outputs from the point cloud with the corresponding measured values. The sensor-estimated and measured tree heights were 3.05 ± 0.34 m and 3.13 ± 0.33 m, respectively, with a mean absolute error (MAE) of 0.08 m, a root mean squared error (RMSE) of 0.09 m, a linear coefficient of determination (r2) of 0.98, a confidence interval (CI) of −0.14 to −0.02 m, and a high concordance correlation coefficient (CCC) of 0.96, indicating strong agreement and high accuracy. The sensor-estimated and measured canopy volumes were 13.76 ± 2.46 m3 and 14.09 ± 2.10 m3, respectively, with an MAE of 0.57 m3, an RMSE of 0.61 m3, an r2 value of 0.97, and a CI of −0.92 to 0.26 m3, demonstrating high precision.
For tree and row spacing, the sensor-estimated and measured distances were 3.04 ± 0.17 m and 3.18 ± 0.24 m, and 3.35 ± 0.08 m and 3.40 ± 0.05 m, respectively, with RMSE and r2 values of 0.12 m and 0.92 for tree spacing, and 0.07 m and 0.94 for row spacing. The MAE and CI values were 0.09 m and −0.18 to 0.05 m for tree spacing, and 0.01 m and −0.1 to 0.002 m for row spacing, respectively. Although minor differences were observed, the sensor estimates proved effective, though some measurements require further refinement. The results are based on a limited dataset of six apple trees, providing initial insights into geometric feature characterization performance; this small sample size limits the generalizability of the findings and necessitates caution in interpreting the results. Future studies should incorporate a broader and more diverse dataset to validate and refine the characterization, enhancing management practices in apple orchards.
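Two of the processing steps listed above, voxel grid downsampling and ground point removal, can be sketched in plain Python. This is only an illustration of the general technique, not the study's implementation (which used commercial software); the voxel size and ground tolerance below are assumed values, and real pipelines typically rely on a point cloud library such as Open3D.

```python
import math
from collections import defaultdict


def voxel_downsample(points, voxel=0.5):
    """Voxel grid downsampling: keep one representative point
    (the centroid) per occupied voxel of edge length `voxel` (m)."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(math.floor(c / voxel) for c in p)
        buckets[key].append(p)
    # Centroid of each voxel's points becomes its representative.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in buckets.values()]


def remove_ground(points, ground_tol=0.2):
    """Drop points within `ground_tol` (m) of the lowest z value.
    Assumes locally flat ground; real data may need plane fitting."""
    z_min = min(p[2] for p in points)
    return [p for p in points if p[2] - z_min > ground_tol]


def tree_height(points):
    """Tree height as the z-extent of the (pre-ground-removal) cloud."""
    zs = [p[2] for p in points]
    return max(zs) - min(zs)
```

For example, `voxel_downsample` collapses near-duplicate returns on the same branch into a single point before the more expensive clustering step, and `tree_height` is applied per tree after DBSCAN has separated the crowns.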
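The agreement statistics reported in the accuracy assessment (MAE, RMSE, r2, and the concordance correlation coefficient) can all be computed from paired sensor/manual measurements. A minimal sketch, assuming population (biased) variances in the CCC, which is the usual convention for Lin's concordance correlation coefficient:

```python
import math


def agreement_metrics(estimated, measured):
    """Return (MAE, RMSE, r^2, CCC) for paired estimates and references.
    r^2 is the squared Pearson correlation; CCC is Lin's concordance
    correlation coefficient, which also penalizes bias between the two."""
    n = len(estimated)
    mae = sum(abs(e - m) for e, m in zip(estimated, measured)) / n
    rmse = math.sqrt(sum((e - m) ** 2 for e, m in zip(estimated, measured)) / n)
    mean_e = sum(estimated) / n
    mean_m = sum(measured) / n
    var_e = sum((e - mean_e) ** 2 for e in estimated) / n
    var_m = sum((m - mean_m) ** 2 for m in measured) / n
    cov = sum((e - mean_e) * (m - mean_m)
              for e, m in zip(estimated, measured)) / n
    r2 = cov ** 2 / (var_e * var_m)
    ccc = 2 * cov / (var_e + var_m + (mean_e - mean_m) ** 2)
    return mae, rmse, r2, ccc
```

Unlike r2, which is insensitive to a constant offset, the CCC shrinks toward zero as the systematic bias between sensor and manual values grows, which is why the abstract reports both.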
LiDAR sensors have great potential for enabling crop recognition (e.g., plant height, canopy area, plant spacing, and intra-row spacing measurements) and the recognition of agricultural working environments (e.g., field boundaries, ridges, and obstacles) using agricultural field machinery. The objective of this study was to review the use of LiDAR sensors in the agricultural field for the recognition of crops and agricultural working environments. This study also highlights LiDAR sensor testing procedures, focusing on critical parameters, industry standards, and accuracy benchmarks; it evaluates the specifications of various commercially available LiDAR sensors with applications for plant feature characterization and highlights the importance of mounting LiDAR technology on agricultural machinery for effective recognition of crops and working environments. Different studies have shown promising results of crop feature characterization using airborne LiDAR, such as coefficient of determination (R2) and root-mean-square error (RMSE) values of 0.97 and 0.05 m for wheat, 0.88 and 5.2 cm for sugar beet, and 0.50 and 12 cm for potato plant height estimation, respectively. For soybean canopy estimation using LiDAR, a relative error of 11.83% was observed between sensor and manual measurements, with a highest distribution correlation of 0.675 and an average relative error of 5.14%. An object detection accuracy of 100% was found for plant identification using three LiDAR scanning methods: center of the cluster, lowest point, and stem–ground intersection. LiDAR was also shown to effectively detect ridges, field boundaries, and obstacles, which is necessary for precision agriculture and autonomous agricultural machinery navigation.
Future directions for LiDAR applications in agriculture emphasize the need for continuous advancements in sensor technology, along with the integration of complementary systems and algorithms, such as machine learning, to improve performance and accuracy in agricultural field applications. A strategic framework for implementing LiDAR technology in agriculture includes recommendations for precise testing, solutions for current limitations, and guidance on integrating LiDAR with other technologies to enhance digital agriculture.
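To illustrate the plant height estimation task reviewed above: a common approach takes the spread between a high canopy percentile and a low ground percentile of the returns' z-values, which is more robust to stray returns than a simple max − min. This is a generic sketch, not the method of any specific study cited; the percentile choices (2nd and 98th) are illustrative assumptions.

```python
def percentile(values, q):
    """Nearest-rank percentile (q in [0, 100]) of a list of numbers."""
    s = sorted(values)
    idx = min(len(s) - 1, max(0, round(q / 100 * (len(s) - 1))))
    return s[idx]


def plant_height(z_values, ground_q=2, canopy_q=98):
    """Plant height as the spread between a high (canopy top) and a low
    (ground) z percentile, so a few spurious returns cannot inflate it."""
    return percentile(z_values, canopy_q) - percentile(z_values, ground_q)
```

With a single stray return 9 m above a 1 m canopy, max − min would report a 10 m plant, while the percentile spread still reports roughly 1 m.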