2014
DOI: 10.5121/ijma.2014.6502

An Improved Tracking Using IMU and Vision Fusion for Mobile Augmented Reality Applications

Abstract: Mobile Augmented Reality (MAR) is becoming an important cyber-physical

Cited by 12 publications (13 citation statements)
References 30 publications
“… Camera frame {c}: This frame is attached to the camera on the mobile robot with the x-axis pointing to the image plane in the right direction and z-axis pointing along the optical axis and origin located at the camera optical center. The IMU method provides orientation of the body {b} with respect to (wrt) world frame {w} R wb and vision method provides orientation of the object {o} wrt to camera frame {c} R co [ 26 , 60 ]. …”
Section: Proposed Modeling Methods (mentioning)
confidence: 99%
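The rotation chaining named in this excerpt (IMU gives R_wb, vision gives R_co) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the body-to-camera extrinsic R_bc and all matrix values here are assumptions chosen for the example.

```python
# Minimal sketch of chaining the rotations named in the excerpt:
# R_wb (IMU: body wrt world) and R_co (vision: object wrt camera).
# Given a known body-to-camera extrinsic R_bc (an assumption here),
# the object orientation in the world frame is R_wo = R_wb * R_bc * R_co.

def matmul3(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Illustrative extrinsic: camera axes aligned with body axes.
R_bc = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Illustrative inputs: IMU reports a 90-degree yaw of the body in the
# world frame; vision reports the object axis-aligned with the camera.
R_wb = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
R_co = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

R_wo = matmul3(matmul3(R_wb, R_bc), R_co)  # object orientation wrt world
```

With identity R_bc and R_co, the composed R_wo simply equals R_wb, which is a quick sanity check on the chaining order.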
“…Pose estimation has been studied in past and recent times for applications in object positioning [ 7 ], robotics, and augmented reality (AR) tracking [ 26 ]. This section will discuss the existing technologies used for pose estimation in our environment these days.…”
Section: Related Work (mentioning)
confidence: 99%
“…In [11], two sensor fusion methods based on EKF were developed to estimate the pose of the monocular camera with the IMU sensor. One report of the literature [3] describes a robust tracking system that combines the IMU sensor and camera for Mobile Augmented Reality (MAR) applications. A relative position estimator for AUVs [6] was developed by fusing a monocular camera with an accelerometer and a rate gyrosensor based on EKF.…”
Section: Related Work (mentioning)
confidence: 99%
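The EKF-based IMU/camera fusion pattern these works describe boils down to a predict step from the inertial sensor and a correct step from the vision measurement. A toy scalar Kalman-style cycle conveys the idea; all names, noise values, and measurements below are illustrative assumptions, not values from any of the cited systems.

```python
# Toy 1-D Kalman-style fusion: predict orientation by integrating a gyro
# rate, then correct with a vision angle measurement. Noise parameters
# (q, r) and all sensor readings are illustrative assumptions.

def kf_step(angle, var, gyro_rate, dt, q, vision_angle, r):
    # Predict: integrate the gyro rate; process noise q inflates variance.
    angle_pred = angle + gyro_rate * dt
    var_pred = var + q
    # Correct: blend in the vision measurement via the Kalman gain k.
    k = var_pred / (var_pred + r)
    angle_new = angle_pred + k * (vision_angle - angle_pred)
    var_new = (1 - k) * var_pred
    return angle_new, var_new

angle, var = 0.0, 1.0                            # initial state and variance
for vision_angle in [0.11, 0.20, 0.31]:          # illustrative vision readings
    angle, var = kf_step(angle, var,
                         gyro_rate=1.0, dt=0.1,  # illustrative gyro data
                         q=0.01, vision_angle=vision_angle, r=0.05)
```

The full EKF versions in the cited papers carry a multi-dimensional state (pose, velocity, sensor biases) and linearize the measurement model, but the predict/correct structure is the same.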
“…Integrating multi-sensor data based on the benefits of each sensor can achieve more wide range, accurate, rapid, and robust measurement than measurement by a single sensor. In AR research, sensor fusion based on vision sensing is used to track objects correctly in an real image sequence and to overlay virtual images on it (e.g., [1], [2] [3]). In the area of human measurement, vision-based sensor fusion is used to measure human body motion [4].…”
Section: Introduction (mentioning)
confidence: 99%