2020 Indo-Taiwan 2nd International Conference on Computing, Analytics and Networks (Indo-Taiwan ICAN)
DOI: 10.1109/indo-taiwanican48429.2020.9181333
A Cognitive Framework on Object Recognition and Localization for Robotic Vision

Cited by 4 publications (2 citation statements)
References 15 publications
“…With the advance of technology, mobile robots have become increasingly familiar, and the pursuit of enhanced localization accuracy through multi-sensor fusion has gained substantial attention across various fields [1]. In the domain of multi-sensor fusion, achieving high-quality real-time Simultaneous Localization and Mapping (SLAM) methods has emerged as a prominent research avenue.…”
Section: Introduction (mentioning; confidence: 99%)
“…The technique of color segmentation for object detection in real-time vision-based applications, implemented through a stereoscopic system, is briefly portrayed in [8]. An elaborate discussion of background subtraction aimed at object recognition and localization is presented in [9,10]. Furthermore, deep neural network approaches to object detection, localization and recognition are expressed in [11][12][13][14][24].…”
Section: Introduction (mentioning; confidence: 99%)
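
To illustrate the color-segmentation approach mentioned in the statement above, the following is a minimal sketch, not the method of [8]: it thresholds a single camera view in HSV space with OpenCV and localizes the largest color blob. The HSV bounds, the function name locate_colored_object, and the single-camera capture are illustrative assumptions.

# Hedged sketch: color-segmentation-based object localization (assumed parameters).
import cv2
import numpy as np

def locate_colored_object(frame_bgr,
                          lower_hsv=(35, 80, 80),    # assumed lower HSV bound (green-ish)
                          upper_hsv=(85, 255, 255)): # assumed upper HSV bound
    """Return the bounding box (x, y, w, h) of the largest color blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    # Remove speckle noise before extracting contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # one view; a stereo setup would run this per camera
    ok, frame = cap.read()
    if ok:
        print("Detected box:", locate_colored_object(frame))
    cap.release()

In a stereoscopic system such as the one referenced in [8], one would presumably run this per view and triangulate the box centers to recover depth; that step is omitted here.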