2018
DOI: 10.1109/jsen.2018.2810060
An Efficient Gyro-Aided Optical Flow Estimation in Fast Rotations With Auto-Calibration

Cited by 6 publications (10 citation statements)
References 29 publications

Citation statements, ordered by relevance:
“…The calibration data are shown in Table 1. Next, R_ci was estimated from the gyroscope measurements of the IMU sensor and the optical flow (see [ 16 ] for details). In addition, the average of R_ci over 15 experiments with the CC+LS13 method in [ 16 ] is shown in Table 1, represented as a quaternion.…”
Section: Results
confidence: 99%
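The quoted work averages R_ci over 15 trials and reports the result as a quaternion, but does not state how the rotations are averaged. Below is a minimal sketch assuming the common eigenvector (Markley-style) quaternion averaging; the function name, the [w, x, y, z] ordering, and the synthetic samples are illustrative assumptions, not taken from [16].

```python
import numpy as np

def average_quaternions(quats):
    """quats: (N, 4) array of unit quaternions, assumed here in [w, x, y, z] order."""
    Q = np.asarray(quats, dtype=float)
    Q = Q / np.linalg.norm(Q, axis=1, keepdims=True)   # re-normalize each sample
    M = Q.T @ Q                                        # 4x4 accumulation matrix, sum of q q^T
    eigvals, eigvecs = np.linalg.eigh(M)               # symmetric eigendecomposition
    q_avg = eigvecs[:, np.argmax(eigvals)]             # eigenvector of the largest eigenvalue
    return q_avg / np.linalg.norm(q_avg)

# Illustrative example: 15 noisy estimates of R_ci close to the identity rotation
rng = np.random.default_rng(0)
samples = np.tile([1.0, 0.0, 0.0, 0.0], (15, 1)) + 0.01 * rng.standard_normal((15, 4))
print(average_quaternions(samples))
```

Using the q q^T accumulation makes the average insensitive to the q / -q sign ambiguity of individual estimates, which a naive component-wise mean would not handle.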
“…The camera and the IMU sensor are calibrated as follows: (1) calibrate the camera to determine the focal length ( f ) using the camera calibration module in OpenCV (Open Source Computer Vision); (2) calibrate the gyroscope to prevent gyro drift by averaging the bias offset: the bias offset can be assessed by measuring the gyroscope output over a long period while the sensor is stationary, and the noise can be reduced with a Kalman filter; (3) estimate R_ci using the relation between gyroscope data and optical flow proposed by Li and Ren [ 16 ], which uses the CC+LS13 method; and (4) determine the time offset t_off between the gyroscope and the camera input with a cross-correlation (CC) method [ 28 ] to synchronize the two sensors so that data are collected at matching times. Then ω_cam can be estimated correctly.…”
Section: Proposed Framework
confidence: 99%
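Steps (2) and (4) of the quoted procedure lend themselves to a short sketch: the gyro bias is taken as the long-term average of a stationary recording, and t_off is recovered by cross-correlating the angular-rate magnitudes seen by the gyroscope and by the camera's optical flow, in the spirit of the CC method cited as [28]. The resampling to a common time grid, the variable names, and the sign convention are assumptions for illustration; the Kalman filtering of step (2) and the CC+LS13 estimation of R_ci in step (3) are not reproduced here.

```python
import numpy as np

def estimate_gyro_bias(stationary_rates):
    """stationary_rates: (N, 3) gyro samples in rad/s recorded while the sensor is still."""
    return stationary_rates.mean(axis=0)        # bias offset = long-term average when static

def estimate_time_offset(gyro_rate_mag, cam_rate_mag, dt):
    """
    gyro_rate_mag, cam_rate_mag: 1-D angular-rate magnitudes from the gyroscope and
    from camera optical flow, resampled onto a common uniform grid with step dt [s].
    Returns t_off such that cam(t) ~ gyro(t - t_off).
    """
    g = gyro_rate_mag - gyro_rate_mag.mean()    # remove means before correlating
    c = cam_rate_mag - cam_rate_mag.mean()
    corr = np.correlate(c, g, mode="full")      # cross-correlation over all lags
    lag = np.argmax(corr) - (len(g) - 1)        # lag (in samples) giving the best alignment
    return lag * dt
```

A positive returned t_off would mean the camera stream lags the gyroscope stream by that many seconds, so gyro samples must be shifted forward before they are paired with frames.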
“…On the other hand, gyroscopes do not rely on image content; they provide angular velocities in roll, pitch, and yaw that can be converted into 3D motions, and they are widely used for system control [26] and mobile HCI [8]. Among the potential approaches [2,28,14], one is to fuse the gyroscope into the motion estimation. Hwangbo et al. proposed fusing the gyroscope to improve the robustness of KLT feature tracking [14].…”
Section: Introduction
confidence: 99%
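The fusion idea in this passage, using gyro-measured rotation to predict where image points move between frames, can be sketched as follows. Under pure rotation, the inter-frame image motion is the homography K R K^(-1), where K is the camera intrinsic matrix and R is the rotation integrated from the gyroscope over one frame interval. This illustrates the general gyro-aided principle discussed in [14] and in the indexed paper, not their specific implementations; the function names and the small-angle integration are assumptions.

```python
import numpy as np
import cv2

def rotation_homography(omega, dt, K):
    """omega: (3,) angular velocity in rad/s (camera frame); dt: inter-frame interval in s."""
    rvec = (np.asarray(omega, dtype=np.float64) * dt).reshape(3, 1)  # small-angle rotation vector
    R, _ = cv2.Rodrigues(rvec)                   # rotation accumulated over the frame interval
    return K @ R @ np.linalg.inv(K)              # homography induced by pure rotation: K R K^-1

def predict_flow(points, omega, dt, K):
    """points: (N, 2) pixel coordinates; returns the rotation-predicted flow vectors (N, 2)."""
    H = rotation_homography(omega, dt, K)
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous pixel coordinates
    warped = (H @ pts_h.T).T
    return warped[:, :2] / warped[:, 2:3] - points           # predicted displacement per point
```

Such a prediction can seed a KLT or optical-flow search window during fast rotations, which is the robustness benefit the quoted passage attributes to gyro fusion.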