2010 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2010.5649136

Mobile robot vision navigation & localization using Gist and Saliency

Abstract: We present a vision-based navigation and localization system using two biologically inspired scene-understanding models derived from studies of human visual capabilities: (1) the Gist model, which captures the holistic characteristics and layout of an image, and (2) the Saliency model, which emulates the visual attention of primates to identify conspicuous regions in the image. The localization system uses the gist features and salient regions to accurately localize the robot, while the navigation system…
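The two models in the abstract can be illustrated with a minimal sketch: a gist-style descriptor that summarizes coarse image layout over a grid, and a toy center-surround saliency map. This is a simplified stand-in for the paper's multi-channel gist and saliency features, not the authors' implementation; the grid size and downsampling scale are assumptions.

```python
import numpy as np

def gist_descriptor(img, grid=4):
    # Coarse "gist"-style layout descriptor: mean intensity over a
    # grid x grid partition of the image (a simplified stand-in for
    # the multi-channel gist features described in the paper).
    h, w = img.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = img[i * h // grid:(i + 1) * h // grid,
                       j * w // grid:(j + 1) * w // grid]
            feats.append(cell.mean())
    return np.array(feats)

def saliency_map(img, scale=4):
    # Toy center-surround saliency: a pixel is salient when it differs
    # from the mean of its coarse (downsampled) neighborhood.
    h = (img.shape[0] // scale) * scale
    w = (img.shape[1] // scale) * scale
    img = img[:h, :w]
    coarse = img.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))
    surround = np.repeat(np.repeat(coarse, scale, axis=0), scale, axis=1)
    return np.abs(img - surround)
```

A localization pipeline in this spirit would match the gist vector against stored place descriptors, then use peaks of the saliency map to pick landmark regions for finer matching.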

Cited by 36 publications (1 citation statement)
References 30 publications
“…Also, YOLO frames detection as a regression problem, finding targets end to end without the need for a complicated pipeline [24], which makes it very efficient. Moreover, YOLO outperforms other real-time systems in mean average precision (mAP) [25]. Goyal et al. proposed a model based on the YOLOv5 object detection system for fruit detection and quality assessment in fruit sorting [26].…”
Section: Related Work
confidence: 99%
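The mAP metric cited in the statement above can be illustrated with a minimal single-class average-precision computation. This is a sketch of the general AP idea (area under the precision-recall curve), not the exact protocol of any benchmark; real evaluations add IoU-based matching of detections to ground truth and average AP over classes.

```python
import numpy as np

def average_precision(scores, is_tp, n_gt):
    # AP for one class: sort detections by confidence, accumulate
    # precision/recall, and integrate precision over recall.
    order = np.argsort(-np.asarray(scores, dtype=float))
    tp = np.asarray(is_tp, dtype=float)[order]
    cum_tp = np.cumsum(tp)
    precision = cum_tp / (np.arange(len(tp)) + 1)
    recall = cum_tp / n_gt
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precision, recall):
        if r > prev_r:  # precision at each new recall level
            ap += p * (r - prev_r)
            prev_r = r
    return ap
```

mAP is then the mean of this quantity over all object classes.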