2020
DOI: 10.1007/s10514-020-09915-y

High precision control and deep learning-based corn stand counting algorithms for agricultural robot

Abstract: This paper presents high precision control and deep learning-based corn stand counting algorithms for a low-cost, ultra-compact, 3D printed and autonomous field robot for agricultural operations. Currently, plant traits, such as emergence rate, biomass, vigor, and stand counting, are measured manually. This is highly labor-intensive and prone to errors. The robot, termed TerraSentia, is designed to automate the measurement of plant traits for efficient phenotyping as an alternative to manual measurements. In thi…
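The stand counting named in the abstract is, at its core, a per-frame detect-and-count task. The sketch below is a minimal, purely illustrative counting loop in PyTorch: the Faster R-CNN backbone, the score threshold, and the count_stands helper are assumptions for illustration, not the model or pipeline reported in the paper.

```python
# Hypothetical sketch of detection-based stand counting on a single video frame.
# Not the paper's actual method; model choice and threshold are assumptions.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

def count_stands(frame: torch.Tensor, model, score_threshold: float = 0.5) -> int:
    """Count detections in one RGB frame (tensor of shape [3, H, W], values in [0, 1])."""
    model.eval()
    with torch.no_grad():
        prediction = model([frame])[0]   # torchvision detectors return one dict per input image
    keep = prediction["scores"] >= score_threshold
    return int(keep.sum().item())        # each box kept above the threshold counts as one stand

if __name__ == "__main__":
    # weights=None gives random weights, so this run only exercises the plumbing;
    # a real counter would load weights trained on annotated corn-stand images.
    detector = fasterrcnn_resnet50_fpn(weights=None)
    dummy_frame = torch.rand(3, 480, 640)
    print("stands detected in frame:", count_stands(dummy_frame, detector))
```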

Cited by 53 publications (20 citation statements)
References 39 publications
“…Different techniques such as 3D reconstruction, image processing, and machine learning are used for data analysis and to quantify morphological traits. Existing UGV robotic systems have been employed to measure plant height, plant orientation, leaf angle, leaf area, leaf length, leaf and stem width, and stalk count of species such as maize, sorghum, sunflower, savoy cabbage, cauliflower, and Brussels sprout (Jay et al., 2015; Fernandez et al., 2017; Baweja et al., 2018; Choudhuri and Chowdhary, 2018; Vázquez-Arellano et al., 2018; Vijayarangan et al., 2018; Bao et al., 2019b; Breitzman et al., 2019; Qiu et al., 2019; Young et al., 2019; Zhang et al., 2020), to count cotton bolls (Xu et al., 2018), to characterize the architectural traits and density of peanut canopies (Yuan et al., 2018), to measure the berry size and color of grapes (Kicherer et al., 2015), and to estimate the shape, volume, and yield of vineyards (Lopes et al., 2016; Vidoni et al., 2017). A compact and autonomous TerraSentia rover equipped with three RGB cameras and a LIDAR was demonstrated to acquire in-field LIDAR scans of maize plants to extract their Latent Space Phenotypes (LSPs) (Gage et al., 2019).…”
Section: Review: Many Indoor and Outdoor Robots Were Developed To Measure A Wide Range Of Plant Traits (mentioning)
confidence: 99%
“…The design of the gripper and the degrees of freedom of the robotic manipulator should allow good and gentle contact between the sensing unit and the leaf/stem. A vacuum mechanism attached to a soft gripper can sometimes hold the leaf/stem and help the sensing unit make effective contact and collect accurate data with less damage to the plant organs (Hayashi et al., 2010; Hughes et al., 2016; Zhang et al., 2020). Moreover, autonomous robots should gather data with minimal error (a high signal-to-noise ratio).…”
Section: Perspective Applications Of Robotic Phenotyping (mentioning)
confidence: 99%
“…To test the ability of PAAD to alert the robot before it executes an anomalous behavior, we further perform a real-time anomaly detection task on additional data. In this experiment, the robot was driven by the vision-based navigation algorithm [5] over 1.3 km of field trails, consisting of 750 m of common field environment and 550 m of densely weedy environment.…”
Section: B. Real-Time Test (mentioning)
confidence: 99%
“…Phenobot is a robot for sorghum plant phenotyping (Bao et al., 2019). TerraSentia is a low-cost, three-dimensional (3D) printed field robot that can count corn stands using deep learning methods (Zhang et al., 2020). An autonomous mobile robot was developed for plant phenotyping using a LiDAR sensor and soil sensing with a multipurpose toolhead on a robotic arm (Iqbal et al., 2020a, 2020b).…”
Section: Introduction (mentioning)
confidence: 99%