2008
DOI: 10.1016/j.compag.2007.07.006

A vision based row detection system for sugar beet

Cited by 139 publications (77 citation statements) · References 12 publications

Citation statements (ordered by relevance):
“…Since our algorithm is based on Tillett and Hague (1999), the row recognition errors will be in the same range of 13 to 18 mm. Bakker et al. (2008) reported row recognition errors between 5 and 11 mm with a Hough transform algorithm. Neither Tillett and Hague (1999) nor Bakker et al. (2008) report the dimensions of the crop row width, although crop row width and growth stage are important parameters for weed detection and weed control systems as well.…”
Section: Crop Row Recognition and Crop Row Width
confidence: 99%
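
The Hough-transform row recognition referenced above can be illustrated with a short, generic sketch. The snippet below is an assumption-laden example (OpenCV/NumPy; the function name detect_rows_hough, the vote threshold, and the near-vertical filter are illustrative choices, not details from Bakker et al. (2008) or Tillett and Hague (1999)): it runs a standard Hough transform over an already segmented binary plant image and keeps near-vertical lines as crop-row candidates.

```python
# Minimal sketch of Hough-transform crop-row detection on a segmented
# (binary) vegetation image. Illustrative only; not the implementation
# evaluated in the cited papers.
import cv2
import numpy as np

def detect_rows_hough(plant_mask: np.ndarray, vote_threshold: int = 150):
    """Return (rho, theta) pairs of candidate crop-row lines.

    plant_mask:     uint8 image, 255 where vegetation was segmented.
    vote_threshold: accumulator votes required to accept a line (tuning value).
    """
    # Standard Hough transform: rho resolution 1 px, theta resolution 1 degree.
    lines = cv2.HoughLines(plant_mask, 1, np.pi / 180, vote_threshold)
    if lines is None:
        return []
    # Keep only near-vertical lines, assuming the camera looks along the
    # driving direction so crop rows run roughly top-to-bottom in the image.
    tol = np.deg2rad(20)
    return [(rho, theta) for rho, theta in lines[:, 0]
            if theta < tol or theta > np.pi - tol]
```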
“…Bakker et al. (2008) reported row recognition errors between 5 and 11 mm with a Hough transform algorithm. Neither Tillett and Hague (1999) nor Bakker et al. (2008) report the dimensions of the crop row width, although crop row width and growth stage are important parameters for weed detection and weed control systems as well. Tellaeche et al. (2008) reported a vision-based algorithm that also takes the crop growth stage into account for weed detection.…”
Section: Crop Row Recognition and Crop Row Width
confidence: 99%
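
Since this passage highlights crop row width as a quantity the cited works do not report, a simple way to estimate it from a binary vegetation mask is sketched below. This is a hypothetical illustration under stated assumptions (the column-density approach, the estimate_row_width_px name, and the 0.2 density threshold are not taken from any of the cited papers).

```python
# Hypothetical sketch: estimate average crop-row width (in pixels) from the
# column-wise plant-density profile of a binary vegetation mask.
import numpy as np

def estimate_row_width_px(plant_mask: np.ndarray, min_density: float = 0.2) -> float:
    """Average width of contiguous high-density column bands, in pixels."""
    # Fraction of vegetation pixels in each image column.
    profile = plant_mask.astype(bool).mean(axis=0)
    in_row = profile > min_density
    # Measure each contiguous run of "row" columns.
    widths, run = [], 0
    for flag in in_row:
        if flag:
            run += 1
        elif run:
            widths.append(run)
            run = 0
    if run:
        widths.append(run)
    return float(np.mean(widths)) if widths else 0.0
```

Converting the result from pixels to millimetres would additionally require the camera calibration (ground sampling distance), which this sketch leaves out.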
“…Accordingly, alternative or complementary techniques are required for precise navigation control. The best practice is to use an actual view of the plant lines to determine the correct direction to navigate [2,9–11]. For an automobile, navigational laser scanners and camera systems are common [12]; however, laser scanners are expensive and optimized for automotive applications: they have large, overlapping spots to ensure safe detection of all potential obstacles rather than centimeter-scale resolution [13,14].…”
Section: Introduction
confidence: 99%
“…Although different formulas can be used to process NDVI signals and images, they always result in a grayscale or binary image with an adequate threshold. Both types of images can be used for high-resolution navigation control. Most high-resolution navigation applications use Hough transformations to detect plant rows [2,9–11,17], but cross-correlation is a simpler approach in terms of processing power for a small embedded system. Combining these images with a mask representing the plant line via cross-correlation yields a precise position signal, which can be used to lock onto and follow the plant line.…”
Section: Introduction
confidence: 99%
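
A rough sketch of the NDVI-plus-cross-correlation idea described above follows. It is an illustration built on assumptions (the function names, the 0.3 NDVI threshold, and the comb-shaped row mask are not from the cited work): threshold an NDVI image into a binary plant map, collapse it into a column-wise density profile, and cross-correlate that profile with a synthetic mask at the expected row spacing to obtain a lateral offset signal that can be tracked for navigation.

```python
# Sketch of NDVI thresholding plus cross-correlation against a row-spacing
# mask, as a lightweight alternative to a Hough transform. Illustrative only.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised difference vegetation index, values roughly in [-1, 1]."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def lateral_offset_px(ndvi_img: np.ndarray, row_spacing_px: int,
                      ndvi_threshold: float = 0.3) -> int:
    """Estimate the lateral shift (pixels) of the crop rows in the image."""
    plant = (ndvi_img > ndvi_threshold).astype(float)   # binary plant map
    profile = plant.sum(axis=0)                         # plant density per column
    # Synthetic comb mask: one unit peak per expected crop row.
    mask = np.zeros_like(profile)
    mask[::row_spacing_px] = 1.0
    # Circular cross-correlation via FFT; the argmax is the shift that best
    # aligns the mask with the observed row pattern.
    corr = np.real(np.fft.ifft(np.fft.fft(profile) * np.conj(np.fft.fft(mask))))
    shift = int(np.argmax(corr))
    # Wrap to a signed offset around the image centre line.
    if shift > profile.size // 2:
        shift -= profile.size
    return shift
```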
“…(Bossu et al., 2009; Bakker et al., 2008). This strategy of segmenting plants based on colour makes use of information in multiple spectral bands and has been found to be more robust to illumination changes and shadows than using NIR cameras.…”
Section: Monocular Vision Crop Row Tracking
confidence: 99%
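
The colour-based plant segmentation referred to in this passage is commonly realised with a colour index such as excess green (ExG = 2g − r − b) followed by thresholding; the sketch below shows that generic approach with Otsu's method. The exact index and threshold used by Bossu et al. (2009) or Bakker et al. (2008) may differ, so this should be read as an illustrative assumption rather than their method.

```python
# Generic colour-index segmentation sketch: excess green (ExG) plus Otsu
# thresholding to separate plants from soil in a colour image. Illustrative only.
import cv2
import numpy as np

def excess_green_mask(bgr: np.ndarray) -> np.ndarray:
    """Binary vegetation mask (255 = plant) from a colour image in BGR order."""
    img = bgr.astype(np.float32)
    total = img.sum(axis=2) + 1e-6                 # avoid division by zero
    b = img[..., 0] / total
    g = img[..., 1] / total
    r = img[..., 2] / total
    exg = 2.0 * g - r - b                          # excess-green index
    # Rescale to 8 bit so Otsu's automatic threshold can be applied.
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```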