Crop leaf purpling is a common phenotypic change when plants are subjected to certain biotic and abiotic stresses during growth. Extracting purple leaves allows crop stress to be monitored as an apparent trait and also contributes to crop phenotype analysis, monitoring, and yield estimation. Because of the complexity of the field environment and the differences in size, shape, texture, and color gradation among leaves, purple leaf segmentation is difficult. In this study, we used a U-Net model to segment purple rapeseed leaves at the seedling stage from unmanned aerial vehicle (UAV) RGB imagery at the pixel level. Given the limited spatial resolution of the UAV-acquired rapeseed images and the small object size, the input patch size was carefully selected. Experiments showed that the U-Net model with a patch size of 256 × 256 pixels obtained better and more stable results, with an F-measure of 90.29% and an Intersection over Union (IoU) of 82.41%. To further explore the influence of image spatial resolution, we evaluated the performance of the U-Net model with different image resolutions and patch sizes. The U-Net model performed better than four other commonly used image segmentation approaches: support vector machine, random forest, HSeg, and SegNet. Moreover, regression analysis was performed between the purple rapeseed leaf ratio and the measured N content. The negative exponential model had a coefficient of determination (R²) of 0.858, thereby explaining much of the rapeseed leaf purpling in this study. This purple leaf phenotype could serve as an auxiliary means of monitoring crop growth status so that crops can be managed in a timely and effective manner when nitrogen stress occurs. The results demonstrate that the U-Net model is a robust method for purple rapeseed leaf segmentation and that accurate segmentation of purple leaves provides a new approach for crop nitrogen stress monitoring.
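The evaluation metrics reported above (F-measure and IoU) and the negative exponential regression between the purple-leaf ratio and N content can be illustrated with a minimal sketch. The function names, the exact parameterization of the exponential model, and the fitted data below are assumptions for illustration only; they are not the authors' code or measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def f_measure_and_iou(pred_mask, true_mask):
    """Pixel-level F-measure and IoU for binary masks (1 = purple leaf)."""
    pred = np.asarray(pred_mask, dtype=bool)
    true = np.asarray(true_mask, dtype=bool)
    tp = np.logical_and(pred, true).sum()
    fp = np.logical_and(pred, ~true).sum()
    fn = np.logical_and(~pred, true).sum()
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    iou = tp / (tp + fp + fn)
    return f_measure, iou

# Hypothetical form of the negative exponential relationship between the
# purple-leaf pixel ratio and N content; the abstract does not give the
# exact parameterization used in the study.
def neg_exp(ratio, a, b, c):
    return a * np.exp(-b * ratio) + c

# Fit on synthetic placeholder values (not the study's field measurements).
rng = np.random.default_rng(0)
ratio = np.linspace(0.05, 0.6, 20)
n_content = neg_exp(ratio, 4.0, 3.0, 1.5) + rng.normal(0, 0.05, ratio.size)
params, _ = curve_fit(neg_exp, ratio, n_content, p0=(4.0, 3.0, 1.0))
```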
Rapeseed is an important oil crop in China. Timely estimation of rapeseed stand count at early growth stages provides useful information for precision fertilization, irrigation, and yield prediction. Given the nature of rapeseed, the number of tillering leaves is strongly related to its growth stage. However, no field study has been reported on estimating rapeseed stand count from the number of leaves recognized with convolutional neural networks (CNNs) in unmanned aerial vehicle (UAV) imagery. The objectives of this study were to provide a case for rapeseed stand counting based on existing knowledge of the number of leaves per plant and to determine the optimal timing for counting after rapeseed emergence, at leaf development stages with one to seven leaves. A CNN model was developed to recognize leaves in UAV-based imagery, and rapeseed stand count was estimated from the number of recognized leaves. Leaf detection performance was compared for patch sizes of 16, 24, 32, 40, and 48 pixels. Leaf overcounting occurred when a leaf much larger than the others was recognized as several smaller leaves. Results showed that CNN-based leaf counting achieved the best performance at the four- to six-leaf stage, with F-scores greater than 90% after calibration with the overcounting rate. On average, 806 out of 812 plants were correctly estimated at 53 days after planting (DAP), at the four- to six-leaf stage, which was considered the optimal observation timing. For the 32-pixel patch size, the root mean square error (RMSE) was 9 plants with a relative RMSE (rRMSE) of 2.22% at 53 DAP, while the mean RMSE was 12 plants with a mean rRMSE of 2.89% across all patch sizes. A patch size of 32 pixels was suggested as optimal, balancing performance and efficiency. The results of this study confirmed that it is feasible to estimate rapeseed stand count from the number of leaves recognized with CNNs in UAV imagery.
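The stand-count estimate, the overcounting calibration, and the RMSE/rRMSE figures above can be sketched as follows. The function names, the simple multiplicative form of the overcounting correction, and the leaves-per-plant parameter are assumptions made for illustration; the abstract does not specify the study's exact calibration procedure.

```python
import numpy as np

def estimate_stand_count(detected_leaves, leaves_per_plant, overcount_rate=0.0):
    """Estimate plant count from CNN-detected leaves.

    detected_leaves: raw number of leaf detections in the plot imagery
    leaves_per_plant: expected leaves per plant at the observed stage (e.g. 4-6)
    overcount_rate: assumed fraction of detections caused by large leaves
                    being split into several smaller detections
    """
    calibrated = detected_leaves * (1.0 - overcount_rate)
    return calibrated / leaves_per_plant

def rmse_and_rrmse(estimated, observed):
    """RMSE and relative RMSE (as a percentage of the mean observed count)."""
    estimated = np.asarray(estimated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    rmse = np.sqrt(np.mean((estimated - observed) ** 2))
    rrmse = 100.0 * rmse / observed.mean()
    return rmse, rrmse
```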
The spatial resolution of in situ unmanned aerial vehicle (UAV) multispectral images has a crucial effect on crop growth monitoring and image acquisition efficiency. However, existing studies on the optimal spatial resolution for crop monitoring are mainly based on resampled images, so the resampled spatial resolutions in those studies might not be applicable to in situ UAV images. To determine the optimal spatial resolution of in situ UAV multispectral images for crop growth monitoring, a MicaSense RedEdge 3 camera was mounted on a DJI M600 UAV flying at heights of 22, 29, 44, 88, and 176 m to capture images of seedling rapeseed with ground sampling distances (GSD) of 1.35, 1.69, 2.61, 5.73, and 11.61 cm, respectively. Meanwhile, the normalized difference vegetation index measured by a GreenSeeker (GS-NDVI) and the leaf area index (LAI) were collected to evaluate the performance of nine vegetation indices (VIs) and VI × plant height (PH) products at different GSDs for rapeseed growth monitoring. The results showed that the normalized difference red edge index (NDRE) performed better for estimating GS-NDVI (R² = 0.812) and LAI (R² = 0.717) than the other VIs. Moreover, when the GSD was less than 2.61 cm, the NDRE × PH derived from in situ UAV images outperformed NDRE alone for LAI estimation (R² = 0.757). At coarser GSDs (≥5.73 cm), imprecise PH information and large within-pixel heterogeneity (revealed by semi-variogram analysis) resulted in large random errors in LAI estimation by NDRE × PH. Furthermore, image collection and processing at 1.35 cm GSD took about three times as long as at 2.61 cm. These results suggest that NDRE × PH from UAV multispectral images with a spatial resolution of around 2.61 cm is a preferable choice for seedling rapeseed growth monitoring, while NDRE alone might perform better for lower-spatial-resolution images.
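The NDRE, the NDRE × PH predictor, and the coefficient of determination used to score the estimates above follow standard definitions; a minimal sketch is given below. The function names and array inputs are assumptions for illustration, not the authors' processing chain.

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalized difference red edge index from per-pixel band reflectances."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge)

def ndre_ph(nir, red_edge, plant_height):
    """NDRE multiplied by plant height (PH), the combined predictor for LAI."""
    return ndre(nir, red_edge) * np.asarray(plant_height, dtype=float)

def r_squared(predicted, observed):
    """Coefficient of determination between predictions and field measurements."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```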