Auxin is a key regulator of plant growth and development. Within the root tip, auxin distribution plays a crucial role in specifying developmental zones and coordinating tropic responses. Determining how the organ-scale auxin pattern is regulated at the cellular scale is essential to understanding how these processes are controlled. In this study, we developed an auxin transport model based on actual root cell geometries and carrier subcellular localizations. We tested model predictions using the DII-VENUS auxin sensor in conjunction with state-of-the-art segmentation tools. Our study revealed that auxin efflux carriers alone cannot create the pattern of auxin distribution at the root tip and that AUX1/LAX influx carriers are also required. We observed that AUX1 in lateral root cap (LRC) and elongating epidermal cells greatly enhances auxin's shootward flux, with this flux being predominantly through the LRC, entering epidermal cells only as these cells reach the elongation zone. We conclude that the nonpolar AUX1/LAX influx carriers control which tissues have high auxin levels, whereas the polar PIN carriers control the direction of auxin transport within these tissues.
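The carrier logic summarized in this abstract can be illustrated with a deliberately simplified sketch. This is not the published model (which used real cell geometries and carrier localizations); it is a toy 1D file of cells in which polar efflux (PIN-like) pushes auxin toward the shootward neighbor and nonpolar influx capacity (AUX1-like) sets what fraction of that exported auxin the neighbor recaptures rather than losing it to the apoplast. All parameter names and values are illustrative assumptions.

```python
import numpy as np

def simulate(n_cells=10, p_efflux=0.3, p_influx=0.8, dt=0.1, steps=500):
    """Toy auxin transport along a 1D file of cells (illustrative only).

    p_efflux : polar (PIN-like) export rate from each cell
    p_influx : fraction of exported auxin recaptured by the next cell
               (AUX1-like uptake capacity); the remainder is lost
    """
    a = np.zeros(n_cells)
    for _ in range(steps):
        a[0] += 1.0 * dt              # constant auxin supply at the rootward end
        out = p_efflux * a * dt       # polar efflux from every cell
        a -= out
        a[1:] += p_influx * out[:-1]  # shootward neighbor recaptures a fraction
    return a
```

Even in this caricature, raising `p_influx` in a subset of cells concentrates auxin in that "tissue", while `p_efflux` sets the direction of travel, mirroring the division of labor the study attributes to AUX1/LAX versus PIN carriers.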
Major increases in crop yield are required to keep pace with population growth and climate change. Improvements to the architecture of crop roots promise to deliver increases in water and nutrient use efficiency, but profiling the root phenome (i.e. its structure and function) represents a major bottleneck. We describe how advances in imaging and sensor technologies are making root phenomic studies possible. However, methodological advances in the acquisition, handling and processing of the resulting 'big data' are becoming increasingly important. Advances in automated image analysis approaches such as deep learning promise to transform the root phenotyping landscape. Collectively, these innovations are helping drive the selection of the next generation of crops to deliver real-world impact for ongoing global food security efforts.
Highlight: A phenotyping pipeline was used to quantify seedling root architectural traits in a wheat doubled haploid mapping population. QTL analyses revealed a potential major effect gene regulating seedling root vigour/growth.
We present a novel image analysis tool that allows the semiautomated quantification of complex root system architectures in a range of plant species grown and imaged in a variety of ways. The automatic component of RootNav takes a top-down approach, utilizing the powerful expectation-maximization (EM) classification algorithm to examine regions of the input image, calculating the likelihood that given pixels correspond to roots. This information is used as the basis for an optimization approach to root detection and quantification, which effectively fits a root model to the image data. The resulting user experience is akin to defining routes on a motorist's satellite navigation system: RootNav makes an initial optimized estimate of paths from the seed point to root apices, and the user is able to easily and intuitively refine the results using a visual approach. The proposed method is evaluated on winter wheat (Triticum aestivum) images (and demonstrated on Arabidopsis [Arabidopsis thaliana], Brassica napus, and rice [Oryza sativa]), and results are compared with manual analysis. Four exemplar traits are calculated and show clear illustrative differences between some of the wheat accessions. RootNav, however, provides the structural information needed to support extraction of a wider variety of biologically relevant measures. A separate viewer tool is provided to recover a rich set of architectural traits from RootNav's core representation.
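The idea behind RootNav's first, automatic stage can be sketched in miniature. This is not RootNav's actual implementation (which classifies image regions, not raw intensities, and feeds the result into a path-optimization step); it is a minimal two-component Gaussian mixture fitted by expectation-maximization to pixel intensities, yielding per-pixel probabilities that a pixel belongs to the brighter, root-like class. The function name and defaults are assumptions for illustration.

```python
import numpy as np

def em_root_probability(pixels, iters=50):
    """Fit a 2-component 1D Gaussian mixture by EM; return P(root) per pixel."""
    pixels = np.asarray(pixels, dtype=float)
    # Initialize the two components at the extremes of the intensity range
    mu = np.array([pixels.min(), pixels.max()])
    var = np.array([pixels.var() + 1e-6] * 2)
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        pdf = w / np.sqrt(2 * np.pi * var) * np.exp(
            -(pixels[:, None] - mu) ** 2 / (2 * var))
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = resp.sum(axis=0)
        w = nk / len(pixels)
        mu = (resp * pixels[:, None]).sum(axis=0) / nk
        var = (resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    # Probability of the brighter component, taken here as "root-like"
    return resp[:, np.argmax(mu)]
```

On an image with bright roots against a darker background, the returned probabilities provide exactly the kind of per-pixel likelihood map that a downstream path-fitting step can optimize over.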
In plant phenotyping, it has become important to be able to measure many features on large image sets in order to aid genetic discovery. The size of the datasets, now often captured robotically, precludes manual inspection, hence the motivation for a fully automated approach. Deep learning is an emerging field that promises unparalleled results on many data analysis problems. Building on artificial neural networks, deep approaches have many more hidden layers in the network, and hence have greater discriminative and predictive power. We demonstrate the use of such approaches as part of a plant phenotyping pipeline. We show the success offered by such techniques when applied to the challenging problem of image-based plant phenotyping and demonstrate state-of-the-art results (>97% accuracy) for root and shoot feature identification and localization. We used fully automated, deep learning–based trait identification to identify quantitative trait loci in root architecture datasets. The majority (12 out of 14) of manually identified quantitative trait loci were also discovered using our automated approach based on deep learning detection to locate plant features. We have shown deep learning–based phenotyping to have very good detection and localization accuracy in validation and testing image sets. We have shown that such features can be used to derive meaningful biological traits, which in turn can be used in quantitative trait loci discovery pipelines. This process can be completely automated. We predict a paradigm shift in image-based phenotyping brought about by such deep learning approaches, given sufficient training sets.
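The core operation underlying CNN-based feature localization of the kind described above is convolution. The following is a conceptual sketch, not the paper's network: a single hand-made kernel responds strongly wherever a bright vertical, root-like stripe crosses a dark background, whereas a trained deep network learns many such kernels stacked over several layers. All values here are illustrative.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation, written out explicitly for clarity."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A 5x5 image with a bright vertical "root" down the middle column
img = np.zeros((5, 5))
img[:, 2] = 1.0

# A kernel that responds to a bright vertical line flanked by darkness
kernel = np.array([[-1.0, 2.0, -1.0]] * 3)

response = conv2d(img, kernel)  # peaks where the window is centered on the root
```

In a real detection network, the localization of a feature such as a root tip falls out of where the learned filters' responses peak; here the hand-made filter plays that role in miniature.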