Automatically scoring plant traits using a combination of imaging and deep learning holds promise to accelerate data collection, scientific inquiry, and breeding progress. However, applications of this approach are currently held back by the availability of large, suitably annotated training datasets. Early training datasets targeted Arabidopsis or tobacco, plants whose morphology is quite different from that of grass species such as maize. Two sets of maize training data, one real-world and one synthetic, were generated and annotated for late vegetative stage maize plants using leaf count as a model trait. Convolutional neural networks (CNNs) trained on entirely synthetic data provided predictive power for scoring leaf number in real-world images. This power was less than that of CNNs trained with equal numbers of real-world images; however, in some cases CNNs trained with larger numbers of synthetic images outperformed CNNs trained with smaller numbers of real-world images. When real-world training images were scarce, augmenting the real-world training data with synthetic data improved prediction accuracy. Quantifying leaf number over time can provide insight into plant growth rates and stress responses, and can help to parameterize crop growth models. The approaches and annotated training data described here may help future efforts to develop accurate leaf counting algorithms for maize.
4. Partially or completely overlapping leaves from the perspective of the observer (Figure 2D).
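The augmentation protocol described above can be sketched as follows. This is a minimal, hypothetical illustration of mixing a scarce pool of real-world annotations with a larger synthetic pool before training; the `Sample` structure, pool sizes, and mixing function are assumptions for illustration, not the authors' implementation, and the CNN training step itself is omitted.

```python
# Hypothetical sketch: augmenting scarce real-world leaf-count annotations
# with synthetic images before CNN training. Names and sizes are illustrative.
from dataclasses import dataclass
import random

@dataclass
class Sample:
    image_path: str   # path to a plant image
    leaf_count: int   # annotated leaf number (the model trait)
    synthetic: bool   # True if the image was procedurally generated

def build_training_set(real, synthetic, n_real, n_synthetic, seed=0):
    """Mix a limited sample of real annotations with synthetic ones."""
    rng = random.Random(seed)
    train = rng.sample(real, n_real) + rng.sample(synthetic, n_synthetic)
    rng.shuffle(train)  # interleave sources before batching
    return train

# Toy pools standing in for the two annotated image sets.
real_pool = [Sample(f"real_{i}.png", 8 + i % 5, False) for i in range(50)]
synth_pool = [Sample(f"synth_{i}.png", 8 + i % 5, True) for i in range(500)]

# Scarce real data (20 images) augmented with 200 synthetic images.
train = build_training_set(real_pool, synth_pool, 20, 200)
print(len(train), sum(s.synthetic for s in train))  # 220 200
```

The resulting mixed list would then be fed to whatever CNN training loop is in use; only the dataset composition, not the network, changes between the experimental conditions compared in the abstract.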
Advancements in the use of genome-wide markers have provided new opportunities for dissecting the genetic components that control phenotypic trait variation. However, cost-effectively characterizing agronomically important phenotypic traits on a large scale remains a bottleneck. Unmanned aerial vehicle (UAV)-based high-throughput phenotyping has recently become a prominent method, as it allows large numbers of plants to be analyzed in a time-series manner. In this experiment, 233 inbred lines from the maize (Zea mays L.) diversity panel were grown in a replicated incomplete block design, both under nitrogen-limited conditions and following conventional agronomic practices. UAV images were collected during different plant developmental stages throughout the growing season. A pipeline was developed for extracting plot-level images, filtering images to remove non-foliage elements, and calculating canopy coverage and greenness ratings based on vegetation indices (VIs). Applying the pipeline yielded about half a million plot-level image clips across 12 different time points. High correlations were detected between VIs and ground truth physiological and yield-related traits collected from the same plots, e.g., the vegetative index (VEG) vs. leaf nitrogen levels (Pearson correlation coefficient, R = 0.73), the Woebbecke index vs. leaf area (R = -0.52), and the Visible Atmospherically Resistant Index (VARI) vs. 20-kernel weight, a yield component trait (R = 0.40). A genome-wide association study was performed using canopy coverage and each of the VIs at each date, resulting in N = 29 unique genomic regions associated with image-extracted traits from three or more of the 12 total time points. A candidate gene, Zm00001d031997, a maize homolog of the Arabidopsis HCF244 (high chlorophyll fluorescence 244), located underneath the leading SNPs of the canopy-coverage-associated signals, was repeatedly detected under both nitrogen conditions.
The plot-level time-series phenotypic data and the trait-associated genes provide great opportunities to advance plant science and to facilitate plant breeding.
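The per-plot computation described above can be sketched in a few lines. The sketch below segments foliage with the excess-green index and then averages greenness ratings over the foliage pixels; the threshold value, the choice of excess green for filtering, and the exact index set are assumptions for illustration, not the published pipeline, although VARI, VEG, and the Woebbecke index are computed from their standard RGB formulas.

```python
# Minimal sketch of a plot-level VI pipeline: filter non-foliage pixels with
# the excess-green index, then compute canopy coverage and mean greenness
# ratings (VARI, VEG, Woebbecke index). Threshold and index set are assumed.
import numpy as np

def plot_level_traits(rgb, exg_threshold=0.05):
    """rgb: (H, W, 3) float array with channels scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                  # excess green index
    foliage = exg > exg_threshold        # mask out soil/shadow pixels
    coverage = float(foliage.mean())     # canopy coverage of the plot clip
    eps = 1e-8                           # avoid division by zero
    vari = (g - r) / (g + r - b + eps)   # Visible Atmospherically Resistant Index
    veg = g / (np.power(r + eps, 0.667) * np.power(b + eps, 0.333))
    wi = (g - b) / (np.abs(r - g) + eps) # Woebbecke index
    if not foliage.any():
        return {"coverage": coverage, "VARI": np.nan, "VEG": np.nan, "WI": np.nan}
    return {
        "coverage": coverage,
        "VARI": float(vari[foliage].mean()),
        "VEG": float(veg[foliage].mean()),
        "WI": float(wi[foliage].mean()),
    }

# Toy clip: left half green canopy, right half brownish soil.
clip = np.zeros((10, 10, 3))
clip[:, :5] = [0.2, 0.6, 0.2]   # foliage
clip[:, 5:] = [0.5, 0.4, 0.3]   # soil
traits = plot_level_traits(clip)
print(traits["coverage"])       # 0.5
```

In practice each of the roughly half a million plot clips would pass through a function like this once per flight date, producing the time-series trait matrix used for the correlation and GWAS analyses.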
Unmanned aerial vehicle (UAV)-based imagery has become widely used in collecting agronomic traits, enabling a much greater volume of data to be generated in a time-series manner. As one of the cutting-edge imagery analysis tools, machine learning-based object detection provides automated techniques to analyze these imagery data. In our previous study, UAVs were used to collect aerial photography for field trials of 233 diverse inbred lines grown under different nitrogen treatments. Images were collected during different plant developmental stages throughout the growing season. This image dataset has here been used to develop machine learning techniques for obtaining automated tassel counts at the plot level through the season. To improve detection accuracy, we developed an image segmentation method to remove non-tassel pixels and then fed the filtered images into machine learning algorithms. As a result, our method showed a significant improvement in the accuracy of maize tassel detection. This method can be used in future research to produce time-series counts of tassels at the plot level, and will allow for accurate estimates of flowering-related traits, such as the earliest detected flowering date and the duration of each plot's flowering period. This phenotypic data and the trait-associated genes provide new opportunities for crop improvement and to facilitate future plant breeding.
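The segmentation step described above can be illustrated with a simple sketch: mask pixels whose color is tassel-like and count the resulting connected blobs as candidate detections. The color rule and the blob-counting stand-in are assumptions for illustration; the published approach feeds the filtered images to a trained detector rather than counting components directly.

```python
# Hedged sketch of tassel-pixel segmentation followed by a naive count of
# connected components as candidate tassels. The color thresholds are
# illustrative assumptions, not the published segmentation method.
import numpy as np

def tassel_mask(rgb):
    """Keep bright, yellowish pixels; suppress green canopy and dark soil."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 0.5) & (g > 0.4) & (b < 0.4)

def count_components(mask):
    """Count 4-connected components with an iterative flood fill."""
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1                      # new blob found
                stack = [(i, j)]
                while stack:                    # flood-fill the whole blob
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] and not seen[y, x]:
                        seen[y, x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Toy plot image: green background with two separated tassel-colored blobs.
img = np.tile([0.2, 0.6, 0.2], (20, 20, 1))
img[2:4, 2:4] = [0.8, 0.7, 0.2]
img[10:13, 14:16] = [0.8, 0.7, 0.2]
print(count_components(tassel_mask(img)))  # 2
```

Running a mask-then-detect step like this on each flight date's plot clips is what yields the per-plot time series of tassel counts from which flowering onset and duration can be estimated.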