2017
DOI: 10.1016/j.compag.2017.09.008

A visual navigation algorithm for paddy field weeding robot based on image understanding

Cited by 70 publications (28 citation statements)
References 15 publications

“…Next, it is necessary to obtain the depth information d for the 2D coordinate point (f_x, f_y) of the target in the depth image aligned with the color image, and to obtain the ratio of the depth pixel to the real unit depth scale. In the end, the conversion from the pixel coordinate system to the camera coordinate system can be completed directly by equation (17). In this way, the target 2D coordinate point in the pixel coordinate system can be transformed into the 3D coordinate point in the camera coordinate system.…”
Section: Edge Detection Algorithm Based on Canny
confidence: 99%
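The excerpt above describes converting a 2D target point in the aligned depth image into a 3D point in the camera frame using the depth reading and a depth scale. A minimal sketch of that back-projection under a standard pinhole model follows; the intrinsic parameters (focal lengths and principal point) and the depth-scale value are illustrative assumptions, since the citing paper's equation (17) is not reproduced here.

```python
# Hedged sketch: back-project pixel (u, v) with an aligned depth reading into
# camera coordinates. fx_, fy_, cx, cy (intrinsics) and depth_scale are
# assumed placeholder values, not parameters taken from the cited paper.

def pixel_to_camera(u, v, depth_raw, fx_, fy_, cx, cy, depth_scale):
    z = depth_raw * depth_scale      # raw depth units -> real-world units (e.g. metres)
    x = (u - cx) * z / fx_           # pinhole back-projection along the image x-axis
    y = (v - cy) * z / fy_           # pinhole back-projection along the image y-axis
    return x, y, z

# Example with hypothetical intrinsics for a 640x480 depth camera
x, y, z = pixel_to_camera(u=320, v=240, depth_raw=1250,
                          fx_=615.0, fy_=615.0, cx=320.0, cy=240.0,
                          depth_scale=0.001)   # -> (0.0, 0.0, 1.25)
```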
“…This method is only used to remove weeds in a certain area. Zhang et al. proposed a navigation method for weeding robots based on the smallest univalue segment assimilating nucleus (SUSAN) corner and an improved sequential clustering algorithm [17], but it does not involve the removal of weeds in paddy fields. Malavaz et al. developed a general and robust approach for autonomous robot navigation inside a crop by using light detection and ranging (LiDAR) data [18].…”
Section: Introduction
confidence: 99%
“…Thus, how to use the images that come from the low-resolution imaging sensor for data analysis is a challenge. To overcome these problems, on the one hand, special materials and processes will be used to improve the performance of the imaging sensor; on the other hand, some typical workflows and data fusion algorithms [52] should be designed according to the environmental characteristics of Mars.…”
Section: The Autonomous Navigation for Mars Exploration
confidence: 99%
“…Navigation systems are a crucial part of such autonomous robots, where a guidance line has to be computed to guide the robot for weed control. Vision sensor-based autonomous guidance systems have been widely researched for extracting the crop lines to guide the robot (Choi et al., 2015; Zhang et al., 2017).…”
Section: Introduction
confidence: 99%
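The excerpt above refers to extracting crop lines from a vision sensor to obtain a guidance line. Below is a minimal sketch of one common approach (excess-green segmentation followed by a Hough transform on the resulting edges); the threshold values and the OpenCV pipeline are illustrative assumptions, not the method of the cited papers.

```python
# Hedged sketch of a vision-based guidance-line extractor: segment vegetation
# with the excess-green index, then take the dominant straight line from a
# Hough transform. Threshold values are illustrative assumptions.
import cv2
import numpy as np

def guidance_line(bgr):
    """Return (rho, theta) of the dominant crop line, or None if none is found."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    exg = 2.0 * g - r - b                              # excess-green index highlights plants
    mask = (exg > 20.0).astype(np.uint8) * 255         # assumed vegetation threshold
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    edges = cv2.Canny(mask, 50, 150)                   # edges of the vegetation regions
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=100)
    if lines is None:
        return None
    rho, theta = lines[0][0]                           # strongest accumulator peak
    return rho, theta
```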