2010
DOI: 10.1541/ieejeiss.130.1395
An Improvement of Positional Accuracy for View-Based Navigation Using SURF

Cited by 5 publications (10 citation statements)
References 11 publications
“…Figures 12(a), 12(b), and 12(c) show images of the experimental environment captured at the start position on the training and estimated paths in […]; the maximum position error is 7 cm and 18 cm, respectively, and the average error 4 cm and 5 cm. Even when the estimated path is at an angle with respect to the training path, the authors confirmed a positional precision comparable to that of the conventional method [13]. Moreover, the estimation errors for the rotation at the estimated position indicated by the black triangles in Figs.…”
Section: Experiments When the Estimated Path Is… (mentioning)
confidence: 52%
“…(a) and (b), the maximum position error is 7 cm and 18 cm, respectively, and the average error 4 cm and 5 cm. Even when the estimated path is at an angle with respect to the training path, the authors confirmed a positional precision comparable to that of the conventional method.…”
Section: Experiments To Estimate the Position And Rotation When Off Th… (mentioning)
confidence: 67%
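For context on how figures like the quoted 7 cm / 18 cm maxima and 4 cm / 5 cm averages are typically obtained, the following is a minimal sketch, assuming the estimated and reference (training) paths are sampled as corresponding 2-D points in centimetres. The helper name position_errors and the sample coordinates are hypothetical, not code or data from the cited experiments.

import numpy as np

def position_errors(estimated_xy, reference_xy):
    """Euclidean position error at each sampled point; both arrays are N x 2, in cm."""
    estimated_xy = np.asarray(estimated_xy, dtype=float)
    reference_xy = np.asarray(reference_xy, dtype=float)
    errors = np.linalg.norm(estimated_xy - reference_xy, axis=1)
    return errors.max(), errors.mean()

# Hypothetical sampled positions (cm), purely illustrative.
estimated = [[0.0, 2.0], [50.0, 4.0], [100.0, 7.0]]
reference = [[0.0, 0.0], [50.0, 0.0], [100.0, 0.0]]
max_err, mean_err = position_errors(estimated, reference)
print(f"max error: {max_err:.1f} cm, mean error: {mean_err:.1f} cm")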
“…Therefore, there is a need for a mechanism for human detection from a bird's-eye view, and a number of detection methods have been proposed for this purpose. [6][7][8] Human detection models based on HoG features were proposed in the literature 6,7 ; however, HoG features are a histogram description of image intensity and direction, and are therefore sensitive to three-dimensional (3D) appearance changes of detected objects. In addition, human recognition was implemented via feature point matching using speeded-up robust features (SURF), 8 but SURF points are also sensitive to 3D appearance change.…”
Section: Introduction (mentioning)
confidence: 99%
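Since both the cited paper and the quoted passage hinge on SURF point matching, the following is a minimal sketch of matching SURF keypoints between a stored training view and the current camera image. It assumes opencv-contrib-python built with the non-free xfeatures2d module (SURF is not in the default OpenCV build); the function name match_surf and the file names are hypothetical illustrations, not the cited papers' implementations.

import cv2

def match_surf(train_img_path, query_img_path, hessian_threshold=400, ratio=0.75):
    # Load both views as grayscale images.
    train = cv2.imread(train_img_path, cv2.IMREAD_GRAYSCALE)
    query = cv2.imread(query_img_path, cv2.IMREAD_GRAYSCALE)

    # Detect SURF keypoints and compute descriptors for each view.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    kp_t, des_t = surf.detectAndCompute(train, None)
    kp_q, des_q = surf.detectAndCompute(query, None)

    # Brute-force matching with Lowe's ratio test to discard ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_t, des_q, k=2)
    good = []
    for pair in knn:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return kp_t, kp_q, good

# Example usage (hypothetical file names):
# kp_t, kp_q, matches = match_surf("training_view.png", "current_view.png")
# print(f"{len(matches)} reliable SURF correspondences")

The ratio test is the usual way to keep only distinctive correspondences; the surviving matches can then feed a pose or position estimate, which is where sensitivity to 3D appearance change, as noted in the quotation above, becomes a limitation.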