2009
DOI: 10.1109/tie.2009.2012457
Ceiling-Based Visual Positioning for an Indoor Mobile Robot With Monocular Vision

Cited by 106 publications (12 citation statements)
References 27 publications
“…In this case we would not be able to ascertain the circularity of the estimation cloud. Hence, as complementary information, we also define a circularity index (CI) as in Eq. (27), where λ_i^min and λ_i^max are, respectively, the minimum and maximum of the eigenvalue pair {λ_1, λ_2} appearing in (2). This index is evaluated at every position in the BLC grid.…”
Section: Infrared Measurements
confidence: 99%
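The excerpt does not reproduce the body of the cited Eq. (27), so the exact definition of the circularity index is unknown here. A common choice for such an index is the ratio λ_min/λ_max of the eigenvalues of the estimation cloud's 2-D covariance (1 for a perfectly circular cloud, approaching 0 for a degenerate, stretched one). A minimal sketch under that assumption:

```python
import numpy as np

def circularity_index(points):
    """Ratio of smallest to largest eigenvalue of the 2-D sample covariance.

    CI = 1 for a perfectly circular estimation cloud; CI -> 0 when the cloud
    is stretched along one direction. NOTE: the lambda_min / lambda_max form
    is an illustrative assumption, not the paper's Eq. (27), whose body is
    not shown in the excerpt.
    """
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts, rowvar=False)      # 2x2 covariance of the cloud
    lam = np.linalg.eigvalsh(cov)        # eigenvalue pair {lambda_1, lambda_2}
    return lam.min() / lam.max()

# A circular cloud (points on the unit circle) versus a flattened copy:
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
flat = circle * np.array([1.0, 0.1])     # squeeze the y-axis by 10x
print(circularity_index(circle))  # → 1.0
print(circularity_index(flat))    # → 0.01
```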
“…A comprehensive review of different approaches, together with their features and performance, can be found in [26]. In many works different sensors are used in a cooperative (not fused) way: in [20] a camera and a Laser Imaging Detection and Ranging (LIDAR) sensor, and in [27] a camera and odometry, both for robot navigation applications, while in [24] ArUco markers [28] (widely known encoded markers, also used in this work) and an IMU cooperate for a drone navigation and landing application. There is no dominant approach to fusing cameras with other sensors in positioning and navigation applications.…”
Section: Introduction
confidence: 99%
“…In addition, the camera's lens may be occluded by dynamic objects (e.g., pedestrians) or static objects (e.g., furniture), which seriously limits the application of off-board visual localization methods in indoor environments. On-board visual localization, with cameras mounted directly on the robot, is an alternative that shows great promise [21][22][23]. The key objective is to recognize natural or artificial landmarks with known locations and then calculate the distances between the robot and the landmarks; the robot's location can then be readily determined using trilateration.…”
Section: Introduction
confidence: 99%
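The trilateration step described above can be sketched as follows: the range equations to known landmarks are linearized by subtracting the first one, leaving a linear system in the unknown position that a least-squares solve handles directly. The function and variable names are illustrative, not from the cited papers.

```python
import numpy as np

def trilaterate(landmarks, distances):
    """Estimate a 2-D position from distances to landmarks at known locations.

    For ranges d_i^2 = (x - x_i)^2 + (y - y_i)^2, subtracting the first
    equation from each of the others cancels the quadratic terms, giving
    2(x_i - x_0)x + 2(y_i - y_0)y = d_0^2 - d_i^2 + (x_i^2 + y_i^2) - (x_0^2 + y_0^2),
    which is solved here by least squares.
    """
    L = np.asarray(landmarks, dtype=float)   # shape (n, 2), n >= 3
    d = np.asarray(distances, dtype=float)   # shape (n,)
    A = 2.0 * (L[1:] - L[0])
    b = d[0]**2 - d[1:]**2 + np.sum(L[1:]**2, axis=1) - np.sum(L[0]**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Robot at (1, 2) ranging three landmarks:
landmarks = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
true_pos = np.array([1.0, 2.0])
dists = [np.linalg.norm(true_pos - np.array(p)) for p in landmarks]
print(trilaterate(landmarks, dists))  # → [1. 2.]
```

With more than three landmarks the same least-squares solve averages out range noise, which is why redundant landmarks improve the position estimate.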