2006 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2006.282620

An Embedded Optical Flow Processor for Visual Navigation using Optical Correlator Technology

Abstract: The conceptual design of an embedded high-performance opto-electronic optical flow processor is presented, designed for navigation applications in robotics (ground, aerial, marine) and space (satellites, landing vehicles). It is based on determining 2D fragment image motion by 2D correlation. To meet the real-time performance requirements, the principle of joint transform correlation (JTC) is used together with advanced optical correlator technology. The paper briefly recalls the underlying princ…
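The processor performs this fragment correlation optically, but the measurement it extracts can be emulated digitally for reference. The sketch below is a minimal NumPy illustration and not the paper's design: the function name `fragment_shift`, its sign conventions, and the FFT-based cross-correlation (standing in for the optical joint transform) are all our own assumptions.

```python
import numpy as np

def fragment_shift(ref, cur):
    """Estimate the 2-D displacement of image fragment `cur` relative to
    `ref` from the peak of their FFT-based cross-correlation.
    (Illustrative stand-in for the optical JTC, not the paper's code.)"""
    # Remove the mean so the correlation peak is driven by scene texture
    # rather than by the overall brightness level.
    r = np.fft.fft2(ref - ref.mean())
    c = np.fft.fft2(cur - cur.mean())
    # Circular cross-correlation via the Fourier domain; for
    # cur(x) = ref(x - d) the correlation surface peaks at offset d.
    corr = np.fft.fftshift(np.fft.ifft2(np.conj(r) * c).real)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy, dx = np.array(peak) - np.array(corr.shape) // 2
    return int(dy), int(dx)

# Usage: a 24x24 fragment circularly shifted by (5, -3) pixels.
rng = np.random.default_rng(0)
frag = rng.random((24, 24))
moved = np.roll(frag, shift=(5, -3), axis=(0, 1))
print(fragment_shift(frag, moved))   # -> (5, -3)
```

In the optical implementation the Fourier transforms are performed by lenses rather than FFTs, which is where the real-time advantage of the JTC approach comes from.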

Cited by 8 publications (7 citation statements)
References 20 publications

“…A detailed sensitivity analysis over window sizes from 8x8 to 64x64 pixels has shown an optimal window size of 24x24 pixels at a spacing of 12 pixels (1/2 of the correlated fragment size), the best compromise between the accuracy/reliability of the correlation and the preservation of small details of the underlying 3D scene [14]. Expected performances of the optical flow processor have been estimated on the basis of the conceptual design of the processor and the results of simulation experiments, also taking into account the test results of existing hardware models of the optical correlator built in previous studies [10], [11].…”
Section: Advanced Opto-electronic Optical Flow Processor
confidence: 99%
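For illustration only, the sketch below applies the quoted sampling optimum — one 24x24 correlation window every 12 pixels — to produce a grid of flow vectors. The window size and spacing are the only values taken from the quote; the function `flow_field` is our own assumption and reuses the hypothetical `fragment_shift` helper sketched after the abstract.

```python
import numpy as np

def flow_field(img0, img1, win=24, step=12):
    """Sample the optic-flow field on a regular grid: one `win` x `win`
    correlation window every `step` pixels, following the 24x24 /
    12-pixel optimum quoted above. Assumes fragment_shift is in scope."""
    vectors = []
    h, w = img0.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            dy, dx = fragment_shift(img0[y:y + win, x:x + win],
                                    img1[y:y + win, x:x + win])
            # Store each window centre together with its motion vector.
            vectors.append((y + win // 2, x + win // 2, dy, dx))
    return vectors
```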
“…Most of these visual systems were quite demanding in terms of their computational requirements and/or their weight, or were not well characterized, except for the optical mouse sensors [4], with which a standard error of approximately ±5°/s around 25°/s was obtained in the case of an optical mouse sensor measuring motion over a ±280°/s overall range. However, to our knowledge, very few studies have been published so far in which optic flow systems have been implemented and tested outdoors onboard an unmanned aircraft subject to vibrations, where the illuminance cannot be easily controlled (see [1] for linear 1-D motion sensors and [18,20,27,49] for 2-D optic flow sensors). A particular effort has been made in this study to match the sensor's measurement range [1.5°/s; 25°/s] to the range of approximately [2°/s; 6°/s] experienced during a lunar landing approach phase.…”
Section: Introduction
confidence: 99%
“…Apollo 11). Newer approaches include lidar techniques [1], [2] and visual techniques [3], [4], [5], [6], [7], [8], [9], often supported by inertial measurements. In addition, vision-based navigation plays a key role when an extraterrestrial target must be detected from afar.…”
Section: Introduction
confidence: 99%
“…These systems either use visually assisted inertial navigation systems [25], compute the optic flow by means of an optical correlator [3], [9], or extract information from a single camera [25], [26]. By contrast, the autopilot described here extends the previously described EMD-based OCTAVE autopilot principles [27], [28] to a lunar lander.…”
Section: Introduction
confidence: 99%