2012 IEEE International Conference on Automation Science and Engineering (CASE)
DOI: 10.1109/coase.2012.6386466
A robot platform for unmanned weeding in a paddy field using sensor fusion

Cited by 13 publications (4 citation statements)
References 11 publications
“…However, they achieved higher accuracy (98.2%) after combining RGB + LIDAR data. Kim et al. (2012) could successfully navigate an unmanned weeding robot using sensor fusion of a laser range finder (LRF) and an inertial measurement unit (IMU). The robot also needs more sophisticated and intelligent algorithms to accomplish different subtasks such as sensing, navigation, path planning, and control.…”
Section: Sensors and Controllers Fusion Technique Can Improve the Performance of Robots (mentioning)
confidence: 99%
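The LRF + IMU pairing in the statement above is a standard drift-correction fusion: the gyro gives a smooth short-term heading, while the LRF supplies an absolute reference recovered from the crop rows. The sketch below is a minimal, hypothetical illustration, not Kim et al.'s published algorithm; the function names, the 0.98 gain, and the complementary-filter formulation are all assumptions.

import math

# Hypothetical sketch (not Kim et al.'s actual algorithm): a complementary
# filter fusing an IMU gyro yaw rate with an absolute heading estimated from
# laser range finder (LRF) returns along a crop row. Names/gains are assumed.

def lrf_heading(row_points):
    """Estimate heading relative to a crop row from (x, y) LRF returns
    by least-squares line fitting and taking the slope angle."""
    n = len(row_points)
    mx = sum(p[0] for p in row_points) / n
    my = sum(p[1] for p in row_points) / n
    sxx = sum((p[0] - mx) ** 2 for p in row_points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in row_points)
    return math.atan2(sxy, sxx)  # angle of the fitted row line

def fuse_heading(theta, gyro_rate, lrf_theta, dt, alpha=0.98):
    """Complementary filter: integrate the gyro for short-term accuracy,
    pull toward the LRF heading to cancel long-term gyro drift."""
    predicted = theta + gyro_rate * dt          # IMU dead reckoning
    return alpha * predicted + (1 - alpha) * lrf_theta

# Example: one 10 ms step where slight gyro drift is corrected by the LRF.
points = [(0.0, 0.02), (0.5, 0.03), (1.0, 0.05), (1.5, 0.06)]
theta = fuse_heading(0.0, gyro_rate=0.01, lrf_theta=lrf_heading(points), dt=0.01)
print(f"fused heading: {theta:.4f} rad")

A complementary filter is shown only because it is the simplest fusion that cancels gyro drift; the cited work may equally use a Kalman filter or another estimator.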
“…In order to remove the weeds by mechanical means, the system should first detect the row that it will work on, and then cut or pluck out the weeds. The row-detection precision recorded with mechanical weeding robots was less than 25 mm [23,24], with other systems achieving precision below 6 mm [25,26] or, at the other extreme, up to 3 cm [27,28]. In addition, weed-removal performance (also expressed as weeding efficiency) has been evaluated in systems that reported rates higher than 90% [24,29], and 65% and 82% for two different presented methods [30].…”
Section: Weeding Robotic Systems (mentioning)
confidence: 99%
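To make the row-detection step above concrete, a common baseline is to segment vegetation into a binary mask, take the plant-pixel centroid of each scanline, and fit a straight line through the centroids. The sketch below is an assumption-laden illustration of that baseline, not the method of any of refs. [23-28]; all names are hypothetical.

import numpy as np

# Illustrative sketch (not the cited systems' methods): estimate a crop-row
# line from a binary vegetation mask via per-scanline centroids plus a
# least-squares line fit.

def detect_row(mask):
    """Fit x = a*y + b through per-scanline centroids of a boolean mask."""
    ys, xs = [], []
    for y in range(mask.shape[0]):
        cols = np.flatnonzero(mask[y])
        if cols.size:                       # only scanlines with plant pixels
            ys.append(y)
            xs.append(cols.mean())          # centroid column on this scanline
    a, b = np.polyfit(ys, xs, 1)            # least-squares line fit
    return a, b

# Example: synthetic 100x100 mask with a slightly slanted crop row.
mask = np.zeros((100, 100), dtype=bool)
for y in range(100):
    c = int(50 + 0.05 * y)                  # true row drifts 5 px over the image
    mask[y, c - 1:c + 2] = True             # 3-px-wide row of "plants"
a, b = detect_row(mask)
print(f"row line: x = {a:.3f}*y + {b:.1f}")

Millimetre-level precision of the kind cited above would additionally require camera calibration and a ground-plane model to convert the pixel offset at the implement into a metric lateral error.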
“…The advancement of imaging technologies has provided a great opportunity to sense and create 2D, 3D, and 4D (spatial + temporal) images of plants [24]. 2D, 3D, and 4D perception of the environment has been achieved in agricultural fields using the following sensors: visible light, near-infrared, thermal, fluorescence, spectroscopy, structural topography imaging, digital imaging (RGB), multispectral, color infrared, hyperspectral, spectroradiometer, spectrometer, 3D cameras, moisture, pH, light-reflective, light detection and ranging (LIDAR), sound navigation and ranging (SONAR), ground-penetrating radar, and electrical resistance tomography [24][25][26][27][28][29][30]. Other sensors, such as potentiometers, inertial, mechanical, ultrasonic, optical encoder, RF receiver, piezoelectric rate, Near Infrared (NIR), laser range finder (LRF), Geomagnetic Direction Sensor (GDS), Fiber Optic Gyroscope (FOG), piezoelectric yaw, pitch, and roll rate, acoustic, and Inertial Measurement Units (IMUs), have been used to provide robot direction and navigation feedback [7,[31][32][33].…”
Section: Agricultural Robot Sensing (mentioning)
confidence: 99%