2021
DOI: 10.3390/s21062025

IT-SVO: Improved Semi-Direct Monocular Visual Odometry Combined with JS Divergence in Restricted Mobile Devices

Abstract: Simultaneous localization and mapping (SLAM) has a wide range of applications in mobile robotics. Lightweight and inexpensive vision sensors have been widely used for localization in GPS-denied or weak-GPS environments. Mobile robots not only estimate their pose, but also correct their position according to the environment, so a proper mathematical model is required to obtain the state of robots in their circumstances. Usually, filter-based SLAM/VO regards the model as a Gaussian distribution in the mapping t…
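The abstract is truncated above and the paper's exact formulation is not reproduced here. As background for the JS-divergence component named in the title, the following is a minimal, illustrative sketch (not the paper's implementation) of the Jensen-Shannon divergence between two discretized 1-D Gaussian densities; all function and variable names are assumptions of this sketch.

# Illustrative only: JS divergence between two discretized 1-D Gaussians.
# This is not the formulation used in IT-SVO; it only shows the quantity involved.
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # Standard 1-D Gaussian density evaluated on a grid.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def js_divergence(p, q, eps=1e-12):
    # Normalize the sampled densities into discrete distributions, then
    # JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), with m the mixture.
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

x = np.linspace(-10.0, 10.0, 2001)
p = gaussian_pdf(x, mu=0.0, sigma=1.0)
q = gaussian_pdf(x, mu=1.5, sigma=1.2)
print(js_divergence(p, q))  # bounded above by ln(2) ≈ 0.693

Unlike the KL divergence, the JS divergence is symmetric and bounded, which is one reason it is attractive for comparing distributions in estimation pipelines.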

Cited by 9 publications (7 citation statements) · References 33 publications
“…It is difficult to calculate W(p_r, p_g) directly via Equation (14). According to Kantorovich-Rubinstein duality [19],…”
Section: Loss Function Improvements (mentioning)
confidence: 99%
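Equation (14) of the citing paper is not reproduced in this excerpt. For reference, the Kantorovich-Rubinstein duality the statement invokes is usually written as below, where p_r and p_g presumably denote the two distributions being compared (in such loss-function contexts, typically the real and generated distributions) and the supremum is taken over 1-Lipschitz functions f:

W(p_r, p_g) = \sup_{\lVert f \rVert_L \le 1} \Bigl( \mathbb{E}_{x \sim p_r}\bigl[f(x)\bigr] - \mathbb{E}_{x \sim p_g}\bigl[f(x)\bigr] \Bigr)

This dual form replaces the intractable infimum over joint couplings with an optimization over Lipschitz critics, which is what makes the distance practical to approximate.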
“…However, similar to other V-SLAM methodologies, SVO is not exempt from limitations, with performance potentially degrading in environments with sparse features or under poor lighting conditions. Therefore, the question arises as to how we can harness the strengths of both SVO and ORB-SLAM2 to achieve an even more robust and efficient V-SLAM system [40,41].…”
Section: Semi-direct Visual Odometry (mentioning)
confidence: 99%
“…UAV state estimation is combined with its surrounding environment in which a moving UAV collects data simultaneously through embedded sensors. The sensor calibration determines the accuracy of the UAV orientation and position [46]. The environment map includes landmark coordinates and orientation, and its accuracy is determined by sensor calibration requiring embedded sensors to continuously collect data [72].…”
Section: Collaborative or Multiple UAV SLAM (mentioning)
confidence: 99%
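The statement above describes an environment map holding landmark coordinates and orientation that is refined as calibrated sensors keep collecting data. The cited papers do not specify a data structure; the following is a minimal sketch under that description, with all names and fields being assumptions of this sketch.

# Minimal, assumed landmark-map structure (not from the cited papers): each landmark
# stores a world-frame position and an orientation; the map is updated as data arrives.
from dataclasses import dataclass, field

@dataclass
class Landmark:
    landmark_id: int
    position: tuple      # (x, y, z) in the world frame, metres
    orientation: tuple   # unit quaternion (w, x, y, z)

@dataclass
class EnvironmentMap:
    landmarks: dict = field(default_factory=dict)  # landmark_id -> Landmark

    def add_or_update(self, lm: Landmark) -> None:
        # A real system would fuse calibrated sensor measurements here;
        # this sketch simply keeps the latest estimate.
        self.landmarks[lm.landmark_id] = lm

# Usage: a moving UAV continuously inserts and refines landmarks from sensor data.
env = EnvironmentMap()
env.add_or_update(Landmark(1, (2.0, 0.5, 1.2), (1.0, 0.0, 0.0, 0.0)))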
“…• Perception is the ability of a UAV to discern meaningful information from its sensors to understand the environment [46]. Both localization and map building enhance a UAV's perception.…”
(mentioning)
confidence: 99%