Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '97): Innovative Robotics for Real-World Applications
DOI: 10.1109/iros.1997.656612

Vision-based perception for an automated harvester

Abstract: This paper describes a vision-based perception system which has been used to guide an automated harvester cutting fields of alfalfa hay. The system tracks the boundary between cut and uncut crop; indicates when the end of a crop row has been reached; and identifies obstacles in the harvester's path. The system adapts to local variations in lighting and crop conditions, and explicitly models and removes noise due to shadow. In field tests, the machine has successfully operated in four different locations, at site…
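The cut/uncut boundary tracking described in the abstract can be illustrated with a small sketch. The following Python fragment is a hypothetical illustration, not the paper's actual classifier (which is adaptive and explicitly models shadow): it assumes a simple excess-green discriminant and, for each image row, searches for the column split that maximizes the contrast between the two sides. The names greenness and boundary_column are mine, introduced only for this example.

# Hypothetical sketch (not the authors' code): estimate the cut/uncut
# boundary column in one image row by maximizing the difference in a
# simple greenness discriminant between the two sides of a candidate split.
import numpy as np

def greenness(row_rgb: np.ndarray) -> np.ndarray:
    """Per-pixel discriminant: excess green, a crude proxy for uncut crop."""
    r, g, b = row_rgb[:, 0], row_rgb[:, 1], row_rgb[:, 2]
    return 2.0 * g - r - b

def boundary_column(row_rgb: np.ndarray) -> int:
    """Return the column that best separates low- from high-greenness pixels."""
    d = greenness(row_rgb.astype(np.float64))
    n = d.size
    cum = np.cumsum(d)
    total = cum[-1]
    best_col, best_score = 1, -np.inf
    for c in range(1, n - 1):                 # candidate split between left/right
        left_mean = cum[c - 1] / c
        right_mean = (total - cum[c - 1]) / (n - c)
        score = abs(right_mean - left_mean)   # contrast across the split
        if score > best_score:
            best_score, best_col = score, c
    return best_col

if __name__ == "__main__":
    # Synthetic row: dull cut stubble on the left, green uncut crop on the right.
    row = np.concatenate([np.tile([120, 110, 80], (80, 1)),
                          np.tile([60, 140, 50], (48, 1))], axis=0)
    print(boundary_column(row))               # expected near column 80

Running the boundary search independently on each image row and fitting a line or curve to the resulting columns would give a per-frame estimate of the cut/uncut line; the paper's system additionally adapts its discriminant to local lighting and crop conditions.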

Cited by 67 publications (48 citation statements); References: 8 publications
“…For example, an automated harvester used an adaptive vision-based classifier to track the cut/uncut line in an alfalfa field (Ollis & Stentz, 1997). This system was able to autonomously harvest hundreds of acres of crop in various fields and lighting conditions and included vision-based techniques for end-of-row detection and simple color-based obstacle detection (Pilarski et al, 2002).…”
Section: History and Prior Art (mentioning)
confidence: 99%
“…A row-following system (using odometry and machine vision) for harvesting in cauliflower fields was developed by Marchant et al [49]. At Carnegie-Mellon Robotics Institute, an autonomous vehicle for cutting forage using vision-based perception on the cut and uncut regions of crop was developed [59,60]. Noguchi et al [53] combined computer vision with fuzzy logic, genetic algorithms, and neural networks in a system for "smart spraying" for weed control and detecting crop growth.…”
Section: Agricultural Vehicles (mentioning)
confidence: 99%
“…The criterion Q(l, r) which we seek to optimize is the difference between the average values of some characteristic J within the image road region road (l, r) (as defined by the candidate left and right edge curves and the borders of the image) and that characteristic in the region outside the road offroad (l, r) (see [8] for a related approach applied to autonomous harvesting). For MAP estimation, we combined this with a weak Gaussian prior on edge locations p(l, r) to obtain the following expression:…”
Section: Constrained Road Segmentation (mentioning)
confidence: 99%
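The quoted statement elides its final expression. Purely as a hedged reading of the prose above (the symbols \bar{J}, \mu_l, \mu_r, \sigma_l, \sigma_r below are illustrative notation of mine, not taken from the cited paper), the criterion and the MAP objective it describes could be written as:

Q(l, r) = \bar{J}_{\mathrm{road}(l,r)} - \bar{J}_{\mathrm{offroad}(l,r)},
\qquad
(\hat{l}, \hat{r}) = \arg\max_{l,r}\, \bigl[\, Q(l, r) + \log p(l, r) \,\bigr],
\qquad
p(l, r) \propto \exp\!\Bigl( -\tfrac{(l - \mu_l)^2}{2\sigma_l^2} - \tfrac{(r - \mu_r)^2}{2\sigma_r^2} \Bigr),

where \bar{J}_{\mathrm{road}(l,r)} and \bar{J}_{\mathrm{offroad}(l,r)} are the mean values of the characteristic J inside and outside the candidate road region, and p(l, r) is the weak Gaussian prior on the left and right edge locations.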