This paper presents a novel framework to extract metro tunnel cross sections (profiles) from Terrestrial Laser Scanning point clouds. The framework consists of two steps: tunnel central axis extraction and cross section determination. In tunnel central axis extraction, we propose a slice-based method to obtain an initial central axis, which is further divided into linear and nonlinear circular segments by an enhanced Random Sample Consensus (RANSAC) tunnel axis segmentation algorithm. This algorithm transforms the problem of hybrid linear and nonlinear segment extraction into the sole segmentation of linear elements defined in tangent space rather than the raw data space, significantly simplifying the tunnel axis segmentation. The extracted axis segments are then provided as input to the cross section determination step, which generates coarse cross-sectional points by intersecting a series of straight lines, rotating orthogonally around the tunnel axis, with their locally fitted quadric surface, i.e., a cylindrical surface. These generated profile points are further refined and densified by solving a constrained nonlinear least squares problem. Our experiments on a Nanjing metro tunnel show that the cross-sectional fitting error is only 1.69 mm. Compared with the designed radius of the metro tunnel, the RMSE (Root Mean Square Error) of the extracted cross sections' radii is only 1.60 mm. We also test our algorithm on another metro tunnel in Shanghai, where the RMSE of the radii is only 4.60 mm, superior to the 6.00 mm achieved by a state-of-the-art method. Apart from accurate geometry, our approach maintains the correct topology among cross sections, thereby guaranteeing the production of a geometric tunnel model without crack defects. Moreover, we show that our algorithm is insensitive to missing data and variations in point density.
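To illustrate how a cross section's radius and fitting error (RMSE of radial residuals) can be measured against a design radius, here is a minimal sketch using an algebraic (Kåsa) least-squares circle fit on synthetic 2D profile points. This is an illustration only, not the authors' constrained nonlinear least squares formulation; the data and noise level are assumptions.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit to 2D points.

    Solves x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F) via linear
    least squares, then recovers the center (cx, cy) and radius r.
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx**2 + cy**2 - F)
    return cx, cy, r

def radius_rmse(points, cx, cy, r):
    """RMSE of the radial residuals |dist(point, center) - r|."""
    d = np.hypot(points[:, 0] - cx, points[:, 1] - cy)
    return np.sqrt(np.mean((d - r) ** 2))

# Synthetic cross section: radius 2.75 m (hypothetical design value)
# with 1 mm Gaussian noise on the sampled profile points.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 500)
r_true = 2.75
pts = np.column_stack([r_true * np.cos(theta),
                       r_true * np.sin(theta)])
pts += rng.normal(scale=0.001, size=pts.shape)

cx, cy, r = fit_circle(pts)
print(f"fitted radius: {r:.4f} m, RMSE: {radius_rmse(pts, cx, cy, r) * 1000:.2f} mm")
```

Comparing the fitted radius of each extracted profile against the design radius, and aggregating the differences into an RMSE, gives the kind of millimeter-level accuracy figures reported above.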
To date, unmanned aerial vehicles (UAVs), commonly known as drones, have been widely used in precision agriculture (PA) for crop monitoring and crop spraying, allowing farmers to increase the efficiency of the farming process while reducing environmental impact. However, spraying pesticides effectively and safely on trees in small fields or rugged environments, such as mountainous areas, remains an open question. To bridge this gap, in this study, an onboard computer vision (CV) component for UAVs is developed. The system is low-cost, flexible, and energy-efficient. It consists of two parts: the hardware part is an Intel Neural Compute Stick 2 (NCS2), and the software part is an object detection algorithm named Ag-YOLO. The NCS2 weighs 18 grams, consumes 1.5 watts, and costs about $66. The proposed Ag-YOLO model is inspired by You Only Look Once (YOLO), trained and tested on aerial images of areca plantations, and shows high accuracy (F1 score = 0.9205) and high speed [36.5 frames per second (fps)] on the target hardware. Compared to YOLOv3-Tiny, Ag-YOLO is 2× faster while using 12× fewer parameters. Based on this study, crop monitoring and crop spraying can be synchronized into one process, so that smart and precise spraying can be performed.
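For reference, the F1 score reported for Ag-YOLO is the harmonic mean of detection precision and recall. A minimal sketch with hypothetical true-positive/false-positive/false-negative counts (not the paper's actual evaluation data):

```python
def f1_score(tp, fp, fn):
    """F1 = harmonic mean of precision and recall for detections."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for illustration: 90 correct detections,
# 10 false alarms, 10 missed trees.
print(f1_score(tp=90, fp=10, fn=10))  # → 0.9
```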