Phenotyping involves the quantitative assessment of anatomical, biochemical, and physiological plant traits. Natural plant growth cycles can be extremely slow, hindering the experimental processes of phenotyping. Deep learning offers substantial support for automating and addressing key plant phenotyping research issues. Machine learning-based high-throughput phenotyping is a potential solution to this phenotyping bottleneck, promising to accelerate the experimental cycles within phenomic research. This work studies the potential of deep networks to predict a plant's expected growth by generating segmentation masks of root and shoot systems into the future. We adapt an existing generative adversarial predictive network to this new domain. The result is an efficient plant leaf and root segmentation network that provides predictive segmentation of what a leaf and root system will look like at a future time, based on time-series data of plant growth. We present benchmark results on two public datasets of Arabidopsis (A. thaliana) and Brassica rapa (Komatsuna) plants. The experimental results show strong performance and demonstrate that the proposed method can match expert annotation. The proposed method is highly adaptable and can be trained, via transfer learning or domain adaptation, on different plant species and mutants.
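To make the prediction setup concrete, the following is a minimal PyTorch sketch of the kind of adversarially trained predictor the abstract describes: a generator maps a stack of past segmentation masks to a predicted future mask, and a discriminator judges (past, future) pairs. This is not the paper's actual architecture; the layer sizes, the number of input steps T, and the 10.0 loss weight are all illustrative assumptions.

```python
# Minimal sketch (not the paper's architecture): a conv generator that maps a
# stack of T past binary masks to a predicted future mask, trained with an
# adversarial loss plus a per-pixel BCE term. All names, layer sizes, and the
# loss weighting are illustrative assumptions. Assumes H and W divisible by 4.
import torch
import torch.nn as nn

T = 4  # number of past time steps fed to the generator (assumed)

class Generator(nn.Module):
    """Encoder-decoder: T past masks in, one future mask (logits) out."""
    def __init__(self, t=T):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(t, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),  # logits
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN-style critic on (past stack, candidate future mask) pairs."""
    def __init__(self, t=T):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(t + 1, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 3, padding=1),  # per-patch real/fake logits
        )
    def forward(self, past, future):
        return self.net(torch.cat([past, future], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(past, future):
    """past: (B, T, H, W) masks; future: (B, 1, H, W) ground-truth mask."""
    # Discriminator: real annotated pairs vs. generated pairs.
    fake = torch.sigmoid(G(past)).detach()
    d_real, d_fake = D(past, future), D(past, fake)
    loss_d = (bce(d_real, torch.ones_like(d_real))
              + bce(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: fool the critic and match the annotated future mask.
    logits = G(past)
    d_gen = D(past, torch.sigmoid(logits))
    loss_g = bce(d_gen, torch.ones_like(d_gen)) + 10.0 * bce(logits, future)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```

Conditioning the discriminator on the past stack, rather than on the candidate mask alone, makes it judge temporal plausibility of the predicted growth, not just mask realism; the per-pixel BCE term keeps the prediction anchored to the annotated future frame.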
Photogrammetry systems are used extensively as volumetric measurement tools in a diverse range of applications, including gait analysis, robotics, and computer-generated animation. For precision applications, the spatial inaccuracies of these systems are of interest. In this paper, an experimental characterisation of a six-camera Vicon T160 photogrammetry system using a high-accuracy laser tracker is presented. The study was motivated by empirical observations that the accuracy of the photogrammetry system varies as a function of location within a measurement volume of approximately 100 m³. Error quantification was implemented by simultaneously tracking a target scanned through a sub-volume (27 m³) using both systems. The position of the target was measured at each point of a grid in four planes at different heights. In addition, the effect of using passive versus active calibration artefacts on system accuracy was investigated. Considering error as a function of position at a fixed height yielded a convex surface, confirming the empirical observations for either calibration artefact. Average errors of 1.48 mm and 3.95 mm were obtained for the active and passive calibration artefacts, respectively. However, it was found that by estimating and applying an unknown scale factor relating the two systems' measurements, the overall accuracy could be improved, with average errors reducing to 0.51 mm and 0.59 mm for the active and passive datasets, respectively. The precision of the measurements was found to be less than 10 μm for each axis.
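The abstract does not specify how the scale factor was estimated, so the sketch below shows one standard way to recover it: a least-squares similarity alignment (Umeyama's method) between corresponding laser-tracker and photogrammetry points, whose closed form yields the scale directly. The function name, the synthetic data, and the noise level are illustrative assumptions.

```python
# Hedged sketch: one standard way to estimate a scale factor relating two
# tracking systems - least-squares similarity alignment (Umeyama, 1991)
# between paired point measurements. The scale falls out in closed form.
import numpy as np

def similarity_align(x, y):
    """Fit y ~ s * R @ x + t in the least-squares sense.

    x, y: (N, 3) arrays of corresponding target positions from the two systems.
    Returns scale s, rotation R (3x3), translation t (3,).
    """
    mx, my = x.mean(axis=0), y.mean(axis=0)
    xc, yc = x - mx, y - my
    # Cross-covariance and its SVD give the optimal rotation.
    U, S, Vt = np.linalg.svd(yc.T @ xc / len(x))
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / xc.var(axis=0).sum()  # closed-form scale
    t = my - s * R @ mx
    return s, R, t

# Synthetic usage: a scaled, offset, noisy copy of "laser tracker" points.
rng = np.random.default_rng(0)
x = rng.uniform(0, 3, size=(100, 3))                   # positions in metres
y = 1.0005 * x + 0.2 + rng.normal(0, 5e-4, x.shape)    # assumed scale and noise
s, R, t = similarity_align(x, y)
err = np.linalg.norm((s * x @ R.T + t) - y, axis=1)
print(f"estimated scale {s:.6f}, mean residual {err.mean() * 1e3:.3f} mm")
```

Residuals computed after applying the fitted scale mirror the reported behaviour: once the small inter-system scale difference is removed, the remaining error drops to the noise floor of the measurements.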
The RE@CT project set out to revolutionise the production of realistic 3D characters for game-like applications and interactive video productions, and to significantly reduce costs, by developing an automated process to extract and represent animated characters from actor performance captured in a multi-camera studio. The key innovation is the development of methods for the analysis and representation of 3D video that allow its reuse for real-time interactive animation. This enables efficient authoring of interactive characters with video-quality appearance and motion.