2015 IEEE Bombay Section Symposium (IBSS)
DOI: 10.1109/ibss.2015.7456637
Vision based object follower automated guided vehicle using compressive tracking and stereo-vision

Abstract: Integration of a visual sensing system plays a vital role in automated navigation by providing a sensing ability of the surrounding environment. The problem of object following is challenging due to changes in appearance that can occur due to motion, pose, illumination and occlusion. The real-time implementation of a computer vision based object following system is presented in this paper. The position of the object to be followed is determined by processing a real time image feed from a calibrated stereo-came…
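The abstract describes recovering the followed object's position from a calibrated stereo camera. A minimal sketch of the underlying principle, assuming a rectified stereo pair with known focal length and baseline (these calibration values are hypothetical, not taken from the paper), is the standard triangulation relation Z = f·B/d:

```python
# Illustrative sketch (not the authors' implementation): depth of a tracked
# object from a calibrated, rectified stereo pair via triangulation.
# Z = f * B / d, where f is the focal length in pixels, B the baseline in
# metres, and d the disparity in pixels between the matched object centres
# in the left and right images.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth in metres of a point observed at the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Assumed calibration: f = 700 px, B = 0.12 m. An object centre matched at a
# disparity of 28 px then lies at 700 * 0.12 / 28 = 3.0 m from the camera.
depth = stereo_depth(700.0, 0.12, 28.0)
print(depth)  # → 3.0
```

In a full object-follower pipeline, the tracker (here, compressive tracking) supplies the object's bounding box in each frame, and the disparity of its centre feeds this relation to produce the range used for navigation control.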

Cited by 11 publications (2 citation statements); References 24 publications
“…In recent years, many research works on human-following systems are vision based. Verma et al. [12] developed a robust vision-based tracking algorithm using speeded-up robust features (SURF). Pang et al. [13] presented a tracker that uses the kernelized correlation filter (KCF) to detect the target at multiple scales.…”
Section: Introduction
confidence: 99%