Background: Aboveground biomass (AGB) is a widely used agronomic parameter for characterizing crop growth status and predicting grain yield. Rapid, accurate and non-destructive estimation of AGB is useful for making informed decisions in precision crop management. Previous studies have investigated vegetation indices (VIs) and canopy height metrics derived from unmanned aerial vehicle (UAV) data to estimate the AGB of various crops. However, the input variables were derived either from one type of data or from different sensors on board UAVs. Whether the combination of VIs and canopy height metrics derived from a single low-cost UAV system can improve AGB estimation accuracy remains unclear. This study used a low-cost UAV system to acquire imagery at a 30 m flight altitude at critical growth stages of wheat in Rugao, eastern China. The experiments were conducted in 2016 and 2017 and involved 36 field plots representing variations in cultivar, nitrogen fertilization level and sowing density. We evaluated the performance of VIs, canopy height metrics and their combination for AGB estimation in wheat with stepwise multiple linear regression (SMLR) and three types of machine learning algorithms (support vector regression, SVR; extreme learning machine, ELM; random forest, RF). Results: Our results demonstrated that the combination of VIs and canopy height metrics improved the estimation accuracy for AGB of wheat over the use of VIs or canopy height metrics alone. Specifically, RF performed the best among SMLR and the three machine learning algorithms, regardless of whether all the original variables or the variables selected by SMLR were used. The best accuracy (R² = 0.78, RMSE = 1.34 t/ha, rRMSE = 28.98%) was obtained when applying RF to the combination of VIs and canopy height metrics.
Conclusions: Our findings suggest that an inexpensive approach consisting of the RF algorithm and the combination of RGB imagery and point cloud data derived from a low-cost, consumer-grade UAV system can improve the accuracy of AGB estimation, and has potential for practical application in the rapid estimation of other growth parameters.
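The workflow described above, fitting a random forest regressor on combined VIs and canopy height metrics and reporting R², RMSE and rRMSE, can be sketched as follows. This is a minimal illustration with synthetic data: the feature names, data ranges and the toy AGB relationship are assumptions, not the study's actual variables.

```python
# Sketch: wheat AGB estimation with random forest on combined vegetation
# indices (VIs) and canopy height metrics. Synthetic, illustrative data;
# real inputs would be per-plot values extracted from UAV imagery and
# point clouds.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 200
# Hypothetical predictors: two VIs and two canopy height metrics
vis = rng.uniform(0.2, 0.9, size=(n, 2))      # e.g. NDVI-like indices
height = rng.uniform(0.1, 1.0, size=(n, 2))   # e.g. mean/max canopy height (m)
X = np.hstack([vis, height])
# Toy AGB (t/ha) driven by both predictor groups plus noise
y = 5 * vis[:, 0] + 4 * height[:, 0] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X_tr, y_tr)
pred = rf.predict(X_te)

r2 = r2_score(y_te, pred)
rmse = mean_squared_error(y_te, pred) ** 0.5   # root mean squared error
rrmse = 100 * rmse / y_te.mean()               # relative RMSE (%)
print(f"R2={r2:.2f}, RMSE={rmse:.2f} t/ha, rRMSE={rrmse:.2f}%")
```

Combining both predictor groups in one feature matrix, as in the study, lets the forest exploit the complementarity between spectral (VI) and structural (height) information.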
Leaf area index (LAI) is a fundamental indicator of plant growth status in agronomic and environmental studies. Due to rapid advances in unmanned aerial vehicle (UAV) and sensor technologies, UAV-based remote sensing is emerging as a promising solution for monitoring crop LAI with great flexibility and applicability. This study aimed to determine the feasibility of combining color and texture information derived from UAV-based digital images for estimating the LAI of rice (Oryza sativa L.). Rice field trials were conducted at two sites using different nitrogen application rates, varieties, and transplanting methods in 2016 and 2017. Digital images were collected using a consumer-grade UAV after sampling at the key growth stages of tillering, stem elongation, panicle initiation and booting. Vegetation color indices (CIs) and grey level co-occurrence matrix-based textures were extracted from mosaicked UAV ortho-images for each plot. To combine the information from two different textures, normalized difference texture indices (NDTIs) were calculated from pairs of randomly selected textures. The relationships between rice LAI and each calculated index were then compared using simple linear regression. Multivariate regression models with different input sets were further used to test the potential of combining CIs with various textures for rice LAI estimation. The results revealed that the visible atmospherically resistant index (VARI) based on the three visible bands and the NDTI based on the mean textures derived from the red and green bands were the best for LAI retrieval in the CI and NDTI groups, respectively. Independent accuracy assessment showed that random forest (RF) exhibited the best predictive performance when combining CI and texture inputs (R² = 0.84, RMSE = 0.87, MAE = 0.69). This study introduces a promising solution of combining color indices and textures from UAV-based digital imagery for rice LAI estimation.
Future studies should focus on finding the best operation mode, a suitable ground resolution, and optimal predictive methods for practical applications.
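The two best-performing indices named above, VARI from the three visible bands and an NDTI built from the GLCM mean textures of the red and green bands, can be sketched for a single plot as follows. The GLCM is built directly in NumPy for self-containment; the pixel offset, grey-level count and synthetic image are assumptions, not the study's processing parameters.

```python
# Sketch: VARI and a normalized difference texture index (NDTI) for one
# plot image cropped from a UAV ortho-image. Synthetic 8-bit data stands
# in for real imagery.
import numpy as np

def vari(r, g, b):
    """Visible atmospherically resistant index from per-plot mean band values."""
    rm, gm, bm = float(r.mean()), float(g.mean()), float(b.mean())
    return (gm - rm) / (gm + rm - bm)

def glcm_mean(band, dx=1, dy=0, levels=256):
    """GLCM 'mean' texture: expected grey level under the normalized
    co-occurrence matrix for a (dx, dy) pixel offset."""
    h, w = band.shape
    a = band[:h - dy, :w - dx].ravel()      # reference pixels
    b = band[dy:, dx:].ravel()              # offset neighbors
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1)              # accumulate co-occurrences
    glcm /= glcm.sum()
    return float(np.sum(np.arange(levels)[:, None] * glcm))

def ndti(t1, t2):
    """Normalized difference texture index of two texture values."""
    return (t1 - t2) / (t1 + t2 + 1e-9)

# Synthetic 8-bit plot image (rows, cols, RGB)
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
r, g, b = img[..., 0], img[..., 1], img[..., 2]

ci = vari(r, g, b)
t_red, t_green = glcm_mean(r), glcm_mean(g)
print("VARI:", round(ci, 4), "NDTI(mean_R, mean_G):", round(ndti(t_red, t_green), 4))
```

In the study's multivariate models, such CI and NDTI values would be computed per plot and stage, then fed jointly into regressors such as RF.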
Unmanned aerial system (UAS)-based remote sensing is a promising technique for precision crop management, but few studies have reported the application of such systems to nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR) and multispectral (MS) cameras) on board a UAS for the estimation of N status at individual stages and their combination, using field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting), and field samples were collected on nearby dates. Two color indices (normalized excess green index (NExG) and normalized green red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)) and two red edge vegetation indices (red edge chlorophyll index (CIred edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R²: 0.79-0.81, root mean squared error (RMSE): 1.43-1.45 g m⁻²) and PNA (R²: 0.81-0.84, RMSE: 2.27-2.38 g m⁻²). The GNDVI from CIR images yielded a moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may lay a strong foundation for the development of UAS-based rice growth monitoring systems, providing useful information for real-time decision making on crop N management.
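The indices compared across the three sensors above follow standard formulations, which can be sketched as simple functions of band values. The formulas below are the commonly published definitions and the example band values are purely illustrative; the study's exact band configurations may differ.

```python
# Sketch: standard formulations of the vegetation indices named in the
# abstract, as functions of per-plot mean band values (reflectance or
# digital numbers). Example inputs are illustrative, not study data.

def nexg(r, g, b):
    """Normalized excess green index: ExG on chromatic coordinates."""
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def ngrdi(r, g):
    """Normalized green red difference index."""
    return (g - r) / (g + r)

def gndvi(nir, g):
    """Green normalized difference vegetation index."""
    return (nir - g) / (nir + g)

def ci_red_edge(nir, red_edge):
    """Red edge chlorophyll index."""
    return nir / red_edge - 1

def datt(nir, red_edge, r):
    """DATT index (commonly published form)."""
    return (nir - red_edge) / (nir - r)

# Illustrative per-plot mean reflectance values
r, g, b = 0.08, 0.12, 0.05
nir, red_edge = 0.45, 0.20
print(nexg(r, g, b), ngrdi(r, g), gndvi(nir, g),
      ci_red_edge(nir, red_edge), datt(nir, red_edge, r))
```

RGB sensors can supply only the first two indices, CIR adds the NIR-based ones, and only the MS camera provides the red edge band, which is consistent with the red edge indices from MS imagery performing best for LNA and PNA.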