2021
DOI: 10.1109/jsen.2020.3016081
Augmented Perception for Agricultural Robots Navigation

Abstract: Producing food in a sustainable way is becoming very challenging today due to the lack of skilled labor, the unaffordable costs of labor when available, and the limited returns for growers as a result of low produce prices demanded by big supermarket chains in contrast to ever-increasing costs of inputs such as fuel, chemicals, seeds, or water. Robotics emerges as a technological advance that can counterweight some of these challenges, mainly in industrialized countries. However, the deployment of autonomous m…

Cited by 46 publications (23 citation statements) | References 15 publications
“…Figure 5a illustrates this phenomenon for several rows longer than 100 m, when the robot of Section 3 followed a regular pattern along parallel rows. The plot in Figure 5a compares the heading values read directly from the VTG NMEA string of a GPS receiver previously validated in the field [23] with the heading measured in real time by an electronic compass fixed to the vehicle. The compass outputs were easily checked, as the actual orientation of the rows is known and can be calculated from satellite images. Nevertheless, an electronic compass is not a plug-and-play device; it requires in situ calibration, because electromagnetic interference within the vehicle alters its readings.…”
Section: Methods (confidence: 99%)
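To make the GPS side of the comparison above concrete: the VTG sentence of the NMEA 0183 protocol carries the course over ground (true-north referenced) in its first data field. A minimal sketch of extracting that heading is shown below; the function name is illustrative (not the authors' implementation), and checksum validation is deliberately skipped for brevity.

```python
from typing import Optional

def vtg_heading(sentence: str) -> Optional[float]:
    """Return the true course over ground (degrees) from an NMEA VTG
    sentence, or None if the sentence is not a usable VTG string.

    Note: the '*hh' checksum suffix is stripped but NOT validated here;
    a field deployment would verify it before trusting the value.
    """
    # VTG talker sentences look like: $GPVTG,054.7,T,034.4,M,005.5,N,010.2,K*48
    if not sentence.startswith("$") or "VTG" not in sentence[:7]:
        return None
    fields = sentence.split("*")[0].split(",")
    try:
        # Field 1 is the track made good; field 2 must be 'T' (true north).
        return float(fields[1]) if fields[2] == "T" else None
    except (IndexError, ValueError):
        return None

print(vtg_heading("$GPVTG,054.7,T,034.4,M,005.5,N,010.2,K*48"))  # → 54.7
```

Comparing this value against a compass reading, as the citing paper describes, only works once the compass has been calibrated in situ, since the GPS course is derived from motion while the compass senses the (locally distorted) magnetic field.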
“…The robot was powered by a stack of three lithium-ion batteries supplying 24 VDC and 195 Ah. The navigation system was based on local perception [23] and combined a 3D stereo/TOF camera (O3M150, ifm electronic GmbH, Essen, Germany), a non-rotational LiDAR rangefinder (Multi-Ray LED Scanner OMD8000-R2100-R2-2V1, Pepperl+Fuchs, Mannheim, Germany), and four ultrasonic sensors (UC2000 30GM IUR2 V15, Pepperl+Fuchs, Mannheim, Germany). The central computing unit mounted on the robot was an industrial fanless computer that managed the sensors through a data acquisition card (NI USB-6216, National Instruments, Austin, TX, USA).…”
Section: Autonomous Ground Robot (confidence: 99%)
“…With the development of the Internet of Things and wireless sensor networks (Yang et al, 2020; Friha et al, 2021), image gathering is becoming easier in the agricultural field. Moreover, smart applications based on agricultural images have been emerging widely across many aspects of agriculture, such as plant disease identification (Nagasubramanian et al, 2019), crop pest recognition (Ayan et al, 2020; Liu and Wang, 2020; Mandal et al, 2021; Wang et al, 2021), fruit identification (Gao et al, 2020; Fu et al, 2021), yield forecasting (Schauberger et al, 2020; Shahhosseini et al, 2020; Jarlan et al, 2021), vision navigation (Kanagasingham et al, 2020; Rovira-Más et al, 2020; Emmi et al, 2021), and agricultural robotics (Chen et al, 2020b; Guo et al, 2020; Wen et al, 2020; Zhang et al, 2020). Although many remarkable achievements exist in these areas, the shortcomings of current intelligent learning methods are also becoming apparent.…”
Section: Introduction (confidence: 99%)