The plant factory is a form of controlled environment agriculture (CEA) that offers a promising solution to the worldwide problem of food security. Plant growth parameters need to be acquired for process control and yield estimation in plant factories. In this paper, we propose a fast and non-destructive framework for extracting growth parameters. First, a ToF camera (Microsoft Kinect V2) is used to obtain a point cloud from the top view, from which the lettuce point cloud is segmented. Based on the growth characteristics of lettuce, a geometric method is proposed to complete the occluded lettuce point cloud. The completed point cloud shows a high linear correlation with actual plant height (R² = 0.961), leaf area (R² = 0.964), and fresh weight (R² = 0.911), a significant improvement over the untreated point cloud. The results suggest that our point cloud completion method has the potential to tackle the problem of obtaining plant growth parameters from a single 3D view with occlusion.
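The abstract above extracts plant height from a top-view point cloud. As a minimal sketch of how such a parameter might be computed (the function name `plant_height` and the percentile-based outlier handling are illustrative assumptions, not the paper's actual method), one could take the vertical extent between the known tray plane and the canopy top:

```python
import numpy as np

def plant_height(points: np.ndarray, ground_z: float) -> float:
    """Hypothetical estimate of plant height in metres from an (N, 3)
    top-view point cloud. Uses the 99th-percentile z of the canopy to
    suppress outliers instead of the raw maximum."""
    canopy_z = np.percentile(points[:, 2], 99)
    return float(canopy_z - ground_z)

# Toy example: a flat canopy 0.10 m above the tray at z = 0.
pts = np.column_stack([
    np.random.rand(1000), np.random.rand(1000), np.full(1000, 0.10)
])
print(round(plant_height(pts, ground_z=0.0), 2))  # → 0.1
```

Leaf area and fresh weight would require additional modelling (e.g. projected area and regression), which the paper derives from the completed point cloud.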
Stereo matching is a high-throughput depth perception method for plant phenotyping. In recent years, the accuracy and real-time performance of stereo matching models have been greatly improved; however, the training process relies on specialized large-scale datasets. In this research, we aim to address the issue of building stereo matching datasets. A semi-automatic method was proposed to acquire the ground truth, including camera calibration, image registration, and disparity image generation. On the basis of this method, spinach, tomato, pepper, and pumpkin were used in the experiments, and a dataset named PlantStereo was built for reconstruction. Taking data size, disparity accuracy, disparity density, and data type into consideration, PlantStereo outperforms other representative stereo matching datasets. Experimental results showed that, compared with disparity accuracy at the pixel level, disparity accuracy at the sub-pixel level can remarkably improve matching accuracy. More specifically, for PSMNet, the EPE and bad-3 error decreased by 0.30 pixels and 2.13%, respectively; for GwcNet, they decreased by 0.08 pixels and 0.42%, respectively. In addition, the proposed workflow based on stereo matching can achieve competitive results compared with other depth perception methods, such as Time-of-Flight (ToF) and structured light, when considering depth error (2.5 mm at 0.7 m), real-time performance (50 fps at 1046 × 606), and cost. The proposed method can be adopted to build stereo matching datasets, and the workflow can be used for depth perception in plant phenotyping.
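The EPE (end-point error) and bad-3 metrics reported above are standard stereo evaluation measures: mean absolute disparity error, and the fraction of pixels whose error exceeds 3 pixels. A minimal sketch (function names are illustrative):

```python
import numpy as np

def epe(pred: np.ndarray, gt: np.ndarray, mask=None) -> float:
    """Mean absolute disparity error (end-point error) over valid pixels."""
    err = np.abs(pred - gt)
    if mask is not None:
        err = err[mask]
    return float(err.mean())

def bad_n(pred: np.ndarray, gt: np.ndarray, n: float = 3, mask=None) -> float:
    """Fraction of pixels whose disparity error exceeds n pixels."""
    err = np.abs(pred - gt)
    if mask is not None:
        err = err[mask]
    return float((err > n).mean())

# Toy example: one of four pixels has a 5-pixel error.
pred = np.array([0.0, 1.0, 5.0, 0.0])
gt = np.zeros(4)
print(epe(pred, gt))     # → 1.5
print(bad_n(pred, gt))   # → 0.25
```

The sub-pixel improvement the abstract reports shows up directly in these metrics: float-valued ground truth lets the network regress disparities between integer pixel positions.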
Stereo matching is an important task in computer vision that has drawn tremendous research attention for decades. However, in terms of disparity accuracy, density, and data size, public stereo datasets struggle to meet the requirements of modern models. In this paper, we aim to address this gap between datasets and models, and propose a large-scale stereo dataset with high-accuracy disparity ground truth named PlantStereo. We used a semi-automatic way to construct the dataset: after camera calibration and image registration, high-accuracy disparity images can be obtained from the depth images. In total, PlantStereo contains 812 image pairs covering a diverse set of plants: spinach, tomato, pepper, and pumpkin. We first evaluated our PlantStereo dataset on four different stereo matching methods. Extensive experiments on different models and plants show that, compared with integer-accuracy ground truth, the high-accuracy disparity images provided by PlantStereo can remarkably improve the training of deep learning models. This paper provides a feasible and reliable method to realize dense reconstruction of plant surfaces. The PlantStereo dataset and related code are available at: https://www.github.com/wangqingyu985/PlantStereo
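The dataset construction step converts registered depth images into disparity ground truth. For a rectified stereo pair this follows the standard relation d = f · B / Z (focal length in pixels, baseline and depth in metres); keeping the result as a float preserves the sub-pixel accuracy the paper emphasizes. A minimal sketch under these assumptions (function name and parameters are illustrative):

```python
import numpy as np

def depth_to_disparity(depth_m: np.ndarray, focal_px: float,
                       baseline_m: float) -> np.ndarray:
    """Convert a metric depth image to a float (sub-pixel) disparity
    image via d = f * B / Z. Pixels with no depth stay at 0."""
    disp = np.zeros_like(depth_m, dtype=np.float32)
    valid = depth_m > 0
    disp[valid] = focal_px * baseline_m / depth_m[valid]
    return disp

# Toy example: 0.7 m depth, 700 px focal length, 6 cm baseline.
depth = np.array([[0.7, 0.0]], dtype=np.float32)
print(depth_to_disparity(depth, focal_px=700.0, baseline_m=0.06))
# → [[60.  0.]]
```

Rounding `disp` to integers at this point would discard exactly the sub-pixel information whose benefit the experiments quantify.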