2021 IEEE Sensors
DOI: 10.1109/sensors47087.2021.9639772

S2L-SLAM: Sensor Fusion Driven SLAM using Sonar, LiDAR and Deep Neural Networks

Abstract: The use of different modalities improves the perception of the environment in situations where the conventional sensors (camera and LiDAR) fail. The inclusion of these modalities, such as sonar or radar, is however difficult, as existing methods for the conventional sensors usually do not generalise well to these different environment representations. We experiment with a modality prediction method to keep using the existing methodologies and to separate the sensing system from the navigation stack of an …

Cited by 6 publications (4 citation statements); references 13 publications.
“…Under the umbrella of Artificial Intelligence (AI), these methods enable UUVs to delve deeper into the intricacies of underwater environments through sophisticated data processing. Deep Learning-based SLAM algorithms [78] empower UUVs with advanced cognitive abilities to make real-time decisions, adapt to dynamic underwater conditions, and navigate with unparalleled accuracy [39]. Deep Learning algorithms excel at extracting intricate patterns and representations from sensor data, encompassing visual, inertial, and acoustic inputs.…”
Section: Advantage of Deep Learning Relative to the Conventional Methods (confidence: 99%)
“…The accuracy and resilience of underwater SLAM systems is improved through sensor fusion techniques, encompassing vision-inertial SLAM, laser-vision SLAM, and multisensor SLAM [37,38]. Multisensor fusion is classified into data-layer, feature-layer, and decision-layer fusion [39,40]. Visual SLAM faces challenges with low-quality images, while IMU-assisted sensors improve robustness. To enhance the accuracy and robustness of underwater SLAM systems, researchers often combine multiple sensors, leveraging sensor fusion techniques.…”
Section: Multiple Sensor Integration in SLAM Odometry: Strengths and … (confidence: 99%)
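The decision-layer fusion mentioned in the statement above can be illustrated with a minimal sketch: each sensor pipeline produces its own estimate with an associated uncertainty, and the estimates are combined by inverse-covariance weighting. The sensor names and numbers below are purely illustrative assumptions, not drawn from the cited papers.

```python
import numpy as np

def fuse_decisions(estimates, covariances):
    """Fuse independent position estimates by inverse-covariance weighting."""
    infos = [np.linalg.inv(c) for c in covariances]          # information matrices
    total_info = sum(infos)
    weighted = sum(i @ e for i, e in zip(infos, estimates))  # information-weighted sum
    return np.linalg.inv(total_info) @ weighted

# A confident LiDAR estimate and a noisier sonar estimate of the same 2D position
# (hypothetical values for illustration only).
lidar_est = np.array([1.0, 2.0])
sonar_est = np.array([1.4, 2.2])
fused = fuse_decisions([lidar_est, sonar_est],
                       [np.diag([0.01, 0.01]),   # LiDAR: low variance
                        np.diag([0.09, 0.09])])  # sonar: higher variance
# The fused estimate lies between the two inputs, pulled toward the
# lower-variance LiDAR estimate.
```

Data-layer and feature-layer fusion differ only in where the combination happens: raw measurements or extracted features are merged before a single estimator runs, rather than merging the estimators' outputs as above.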
“…At this stage, SonoTraceLab supports a single emitter and multiple receivers, as this simulation setup is applicable to many application scenarios, both in engineered sonar sensors [58], [59], [60], [61] and in biologically relevant echolocation setups, since both bats and dolphins can be approximated by a point-source with a known directivity [62]. The emitter and the receiver arrays are defined in the local sensor coordinate system [X s , Y s , Z s ], as shown in figure 1.…”
Section: A. Model Preparation (confidence: 99%)
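The statement above describes a point-source emitter and a receiver array defined in a local sensor frame [Xs, Ys, Zs]. A minimal sketch of that convention, assuming a hypothetical sensor pose (rotation R and translation t) mapping local coordinates into the world frame; the names and array layout are illustrative, not SonoTraceLab's actual API:

```python
import numpy as np

def sensor_to_world(points_s, R, t):
    """Map Nx3 points from the local sensor frame into the world frame."""
    return points_s @ R.T + t

emitter_s = np.array([[0.0, 0.0, 0.0]])        # emitter at the sensor origin
receivers_s = np.array([[ 0.05, 0.0, 0.0],     # small receiver array: offsets
                        [-0.05, 0.0, 0.0]])    # along Xs, in metres

yaw = np.pi / 2                                # sensor rotated 90 deg about Zs
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([1.0, 2.0, 0.0])                  # sensor origin in world coordinates

emitter_w = sensor_to_world(emitter_s, R, t)   # emitter lands at the sensor origin t
receivers_w = sensor_to_world(receivers_s, R, t)
```

Defining the arrays in the sensor frame keeps the emitter/receiver geometry fixed while the single pose (R, t) places the whole sensor in the simulated scene.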