Geospatial Informatics X 2020
DOI: 10.1117/12.2565585

DeepOSM-3D: recognition in aerial LiDAR RGBD imagery

Cited by 3 publications (2 citation statements)
References 0 publications
“…All these, individually or in combination, give an accurate 2D aerial map delineating the positions of obstacles. In addition, airborne 3D LiDAR [20] fitted on the drone can proffer depth readings that can inversely signify the height of the various obstacles. Fusing such multi-sensor information, by methods like bimodal learning [21] from RGB-D information or unified co-attention networks [22] can augment the spatial maps with depth information.…”
Section: Perception and Real-time Formation of Maps
confidence: 99%
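
The bimodal RGB-D fusion mentioned in the statement above can be pictured, very roughly, as a two-branch network: one branch encodes aerial RGB imagery, the other encodes a co-registered LiDAR depth channel, and the fused features are decoded into a depth-augmented spatial map. The sketch below is an illustrative PyTorch example only; it is not the architecture of the cited works [21, 22], and the `BimodalFusionNet` name, layer sizes, and input resolutions are assumptions made for demonstration.

```python
# Minimal sketch of bimodal RGB-D fusion (illustrative only, not the cited
# papers' architectures): two small CNN branches encode RGB and depth
# separately, their features are concatenated, and a decoder produces a
# per-pixel map augmented with depth cues. All layer sizes are placeholders.
import torch
import torch.nn as nn


class BimodalFusionNet(nn.Module):
    def __init__(self, out_channels: int = 1):
        super().__init__()

        def branch(in_ch: int) -> nn.Sequential:
            # Shared branch template: two convs, second one halves resolution.
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            )

        self.rgb_branch = branch(3)    # aerial RGB imagery (3 channels)
        self.depth_branch = branch(1)  # LiDAR-derived depth/height channel
        self.fuse = nn.Sequential(
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, out_channels, 1),  # per-pixel output map
        )

    def forward(self, rgb: torch.Tensor, depth: torch.Tensor) -> torch.Tensor:
        # Late fusion: concatenate branch features along the channel axis.
        fused = torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)
        return self.fuse(fused)


if __name__ == "__main__":
    net = BimodalFusionNet()
    rgb = torch.randn(1, 3, 128, 128)    # dummy aerial RGB tile
    depth = torch.randn(1, 1, 128, 128)  # dummy co-registered LiDAR depth tile
    print(net(rgb, depth).shape)         # torch.Size([1, 1, 128, 128])
```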
“…True 3D imaging of targets is highly desirable in many intelligence, reconnaissance, and surveillance missions because it reveals geometric information about targets and target-scenes that aid in automatic target recognition, navigation, threat-level assessment, and strategic planning [1,2]. Compared to 3D reconstruction from airborne or satellite electro-optic (EO) sensors, 3D synthetic aperture radar (SAR) imaging offers the promise of enhanced target-scene characterization in all weather (day and night), from long range, and from single or limited viewing perspectives with short dwell times, enhancing the feasibility of obtaining up-to-date target-scene models in-flight.…”
Section: Introduction
confidence: 99%