2012 IEEE International Conference on Robotics and Automation 2012
DOI: 10.1109/icra.2012.6224564
Complementary filtering approach to orientation estimation using inertial sensors only

Cited by 46 publications (28 citation statements). References 9 publications.
“…To estimate the velocity of the robot, a terrain-adaptive odometry method [9] is used and combined with IMU data and information provided by ICP registration, fused using an extended Kalman filter (EKF). Precise and stable pitch and roll angles are obtained using a complementary filter [23]. In addition to this information, the currents in the flipper and main track drives are used to provide knowledge about the weight distribution and ground contacts.…”
Section: State Representation and Feature Selection (mentioning)
confidence: 99%
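The pitch estimation that the excerpt above attributes to a complementary filter [23] can be illustrated with a minimal sketch. This is not the cited paper's implementation; the function name, the blend coefficient `alpha`, and the axis convention are illustrative assumptions:

```python
import numpy as np

def complementary_filter(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """One update step of a pitch complementary filter: high-pass the
    integrated gyro rate, low-pass the accelerometer tilt estimate.
    accel is the (ax, ay, az) specific-force reading in m/s^2."""
    ax, ay, az = accel
    # Tilt angle implied by the gravity direction in the accelerometer reading
    pitch_acc = np.arctan2(-ax, np.hypot(ay, az))
    # Blend: trust the gyro over short horizons, the accelerometer long-term
    return alpha * (pitch_prev + gyro_rate * dt) + (1.0 - alpha) * pitch_acc
```

The single gain `alpha` sets the crossover between gyro drift rejection and accelerometer noise/vibration rejection, which is why such filters are attractive on vibration-affected platforms.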
“…Thus, the results of the analysis might prove beneficial to anyone asking the fundamental question: what is actually the best way to implement the EKF? The objective of this paper is therefore to compare four different approaches: a nonlinear model (NLM) [6] and an error model (ERM) [7], each with and without a complementary filter (CF) for attitude estimation [8], [9]. The performance of attitude estimation using the CF was thoroughly evaluated in our previous work [8], including testing of various filters to cope with inertial signals strongly affected by vibration. In this paper, dead reckoning is represented by estimation of the six-degree-of-freedom (6-DOF) pose of the UGV using proprioceptive sensors only: odometry obtained from motor encoders and an inertial measurement unit (IMU) consisting of accelerometers and gyroscopes that provide specific force and angular rate measurements [10].…”
Section: Introduction (mentioning)
confidence: 99%
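The IMU dead reckoning described in the excerpt above hinges on integrating gyroscope angular rates into orientation. A minimal strapdown propagation step might look as follows; the function names and the rotation-matrix representation are assumptions for illustration, not the cited papers' formulation:

```python
import numpy as np

def skew(v):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    x, y, z = v
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])

def integrate_gyro(R, omega, dt):
    """Propagate orientation R (3x3 rotation matrix, body-to-world) by the
    body angular rate omega (rad/s) over dt seconds, using the exact
    Rodrigues rotation for the incremental rotation vector omega*dt."""
    rotvec = np.asarray(omega, dtype=float) * dt
    angle = np.linalg.norm(rotvec)
    if angle < 1e-12:  # negligible rotation this step
        return R
    K = skew(rotvec / angle)
    delta = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return R @ delta
```

Gyro bias makes this integration drift without bound, which is precisely what attitude aids such as the complementary filter or the EKF error model are meant to correct.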
“…The kinematics model is established using odometry, equation (1). The mobile robot is assumed to travel on a flat surface, and the question of wheel slippage is ignored. To describe the mobile robot's kinematics model in this paper, a global coordinate frame and a robot coordinate frame are established (Fig.…”
Section: Kinematics Model (mentioning)
confidence: 99%
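The flat-ground, no-slip odometry model referenced as equation (1) in the excerpt is typically the planar dead-reckoning (unicycle) update; a sketch under that assumption (the symbol names are illustrative, not taken from the cited paper):

```python
import math

def odometry_update(x, y, theta, v, omega, dt):
    """One dead-reckoning step of the planar kinematics model:
    pose (x, y, theta) advanced by linear velocity v and angular
    velocity omega over a timestep dt, assuming flat ground and no slip."""
    x_new = x + v * math.cos(theta) * dt
    y_new = y + v * math.sin(theta) * dt
    theta_new = theta + omega * dt
    return x_new, y_new, theta_new
```

Because each step compounds on the last, encoder quantization and any violated no-slip assumption accumulate into unbounded pose drift, which is why odometry is usually fused with inertial or exteroceptive measurements.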
“…Determining an autonomous mobile robot's position has long been a subject of the intelligent-systems field; a mobile robot's timely knowledge of its own position and heading angle is a prerequisite for executing given tasks [1]. Recent progress in sensor technologies and the increase in microcontrollers' computational power provide an excellent experimental platform for research on the localization of autonomous robots [2]. The usual method for estimating a mobile robot's position and heading angle uses only odometry. This method uses data obtained from onboard sensors, such as wheel encoders, to estimate the change in the mobile robot's position over time.…”
Section: Introduction (mentioning)
confidence: 99%