2022
DOI: 10.1049/cit2.12112
Radar style transfer for metric robot localisation on lidar maps

Abstract: Lidar and visual data are heavily affected in adverse weather conditions due to their sensing mechanisms, which brings potential safety hazards for vehicle navigation. Radar sensing is desirable for building a more robust navigation system. In this paper, cross-modality radar localisation on prior lidar maps is presented. Specifically, the proposed workflow consists of two parts: first, bird's-eye-view radar images are transferred to fake lidar images by training a generative adversarial network offline. Then with onli…

Cited by 8 publications (9 citation statements) | References 49 publications
“…Over recent years, many authors have implemented deep generative models to generate specific synthetic data from custom input data, where the generation of radar to lidar is frequently performed, as expensive lidar data can be exchanged for cheap radars with a generative model. The generative model not only performs translation between sensors, such as lidar to radar [128] or radar to lidar [101], but also translations between lidar representations in different weather conditions [129].…”
Section: Lidar Fusion
confidence: 99%
“…Based on generative modelling, Yin et al. [101] introduce a GAN to convert a spinning radar scan into a synthetic lidar map. Instead of sharing polar features, this network generates a synthetic lidar BEV image from a radar scan.…”
Section: Lidar Fusion
confidence: 99%
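The excerpt above notes that the network produces a synthetic lidar BEV image from a spinning radar scan. The GAN itself is too large for a short sketch, but the preprocessing step it implies, rendering a polar radar scan (azimuth x range bins) into a Cartesian bird's-eye-view image, can be illustrated. This is a minimal numpy sketch; the function name `radar_polar_to_bev` and all parameters (range, image size, resolution) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def radar_polar_to_bev(polar, max_range_m, bev_size, bev_res_m):
    """Render a polar radar scan into a Cartesian BEV image.

    polar       -- array of shape (n_azimuths, n_range_bins), intensities
    max_range_m -- metric range covered by the last range bin (assumed)
    bev_size    -- output image is bev_size x bev_size pixels
    bev_res_m   -- metres per BEV pixel

    Uses nearest-neighbour lookup: each BEV pixel reads the closest
    polar cell; pixels beyond the radar's range are set to zero.
    """
    n_az, n_bins = polar.shape
    half = bev_size * bev_res_m / 2.0
    # Metric coordinates of every BEV pixel centre, sensor at the origin.
    xs = np.linspace(-half + bev_res_m / 2, half - bev_res_m / 2, bev_size)
    gx, gy = np.meshgrid(xs, xs)
    rng = np.hypot(gx, gy)
    az = np.mod(np.arctan2(gy, gx), 2 * np.pi)
    # Nearest polar cell for each pixel (wrap azimuth, clamp range).
    az_idx = np.round(az / (2 * np.pi) * n_az).astype(int) % n_az
    rng_idx = np.clip(
        np.round(rng / max_range_m * (n_bins - 1)).astype(int), 0, n_bins - 1
    )
    return np.where(rng <= max_range_m, polar[az_idx, rng_idx], 0.0)
```

In a pipeline like the one described, this BEV image would be the generator's input, and the GAN would be trained offline to map it to a lidar-style BEV image for matching against the prior lidar map.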
“…SLAM is the core technology by which robots understand their external environment and orientate themselves. From the perspective of sensor choice, the SLAM problem can be solved with visual sensors [1], radar sensors [2], or multi‐module fusion [3]. However, natural working environments contain many moving objects, such as people and other mobile robots.…”
Section: Introduction
confidence: 99%
“…Today, robots are found in different applications. There is a growing need for high-speed, robotic assembly of small parts [1][2][3][4][5][6][7][8]. Robots and virtual reality (VR) are currently of paramount relevance to the advancement of humanity [9, 10].…”
Section: Introduction
confidence: 99%