2016
DOI: 10.14257/ijsip.2016.9.11.03

Visual Location Recognition Based on Coarse-to-Fine Image Retrieval and Epipolar Geometry Constraint for Urban Environment

Abstract: Visual-based location recognition of a mobile device is an important problem in …

Cited by 1 publication (1 citation statement) · References 13 publications
“…This problem is very challenging and under active research, particularly over the past few years with the increase in the number of images and datasets publicly available on the Internet, which provides an opportunity for research in predicting the geographic location of images. Owing to possible applications in landmark recognition [1][2][3], urban reconstruction [4], place recognition [5][6][7], visual navigation [8,9], building recognition [10,11], and robot vision [12,13], location estimation from images [14][15][16][17][18][19][20][21] has drawn attention from the research community over the past decade.…”
Section: Introduction · Citation type: mentioning · Confidence: 99%