2022
DOI: 10.1109/tpami.2021.3098789

Cylindrical and Asymmetrical 3D Convolution Networks for LiDAR-Based Perception


Cited by 70 publications (34 citation statements)
References 61 publications
“…To this end, the sparse convolution [7], [9] is employed to reduce the computational cost. Zhu et al [15] propose a voxel division method with asymmetric convolution based on LiDAR point cloud distribution.…”
Section: Related Work (mentioning)
confidence: 99%
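The voxel division the excerpt refers to partitions the scene in cylindrical rather than Cartesian coordinates, so cell size grows with distance and better matches the uneven density of LiDAR returns. Below is a minimal numpy sketch of such a partition; the grid resolution and the radial and height ranges are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def cylindrical_partition(points, grid_size=(480, 360, 32),
                          rho_range=(0.0, 50.0), z_range=(-4.0, 2.0)):
    """Assign each LiDAR point (x, y, z) to a cylindrical voxel (rho, phi, z).

    points: (N, 3) array of Cartesian coordinates.
    Returns integer voxel indices of shape (N, 3).
    Grid size and coordinate ranges are illustrative, not the paper's values.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rho = np.sqrt(x ** 2 + y ** 2)   # radial distance from the sensor
    phi = np.arctan2(y, x)           # azimuth angle in [-pi, pi]

    # Normalize each cylindrical coordinate into [0, 1) before discretizing.
    rho_n = (rho - rho_range[0]) / (rho_range[1] - rho_range[0])
    phi_n = (phi + np.pi) / (2 * np.pi)
    z_n = (z - z_range[0]) / (z_range[1] - z_range[0])

    norm = np.stack([rho_n, phi_n, z_n], axis=1).clip(0.0, 1.0 - 1e-6)
    return (norm * np.array(grid_size)).astype(np.int64)

# Example: many points fall into comparatively few occupied voxels, and
# distant (sparse) regions get larger cells than nearby ones.
pts = np.random.uniform(-50, 50, size=(100_000, 3))
idx = cylindrical_partition(pts)
print(np.unique(idx, axis=0).shape[0], "occupied voxels")
```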
“…Duerr et al [26] propose a recurrent segmentation architecture using range images, which recursively aggregate the features of previous scans in order to exploit the short term temporal dependencies. In [15], superimposing point clouds in 3D space is adopted for multiple scans segmentation, whose memory consumption and computational time increase linearly with the total number of scans per input model. In this paper, we introduce an efficient range residual image representation, where the effective features can be extracted by Meta-Kernel.…”
Section: Related Work (mentioning)
confidence: 99%
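Range-image methods such as the recurrent architecture in the excerpt above (and SqueezeSeg, cited further below) rest on a spherical projection of the scan onto a 2D grid. The following is a minimal sketch of that projection, assuming a 64 x 2048 image and a vertical field of view roughly matching a 64-beam sensor; the residual-image construction and Meta-Kernel mentioned in the excerpt are not reproduced here.

```python
import numpy as np

def range_projection(points, H=64, W=2048, fov_up=3.0, fov_down=-25.0):
    """Project LiDAR points onto an (H, W) range image (spherical projection).

    Each pixel stores the range of a point that falls into it; empty pixels
    stay at -1. Field-of-view values are in degrees and are illustrative.
    """
    fov_up, fov_down = np.radians(fov_up), np.radians(fov_down)
    fov = fov_up - fov_down

    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points, axis=1) + 1e-8

    yaw = np.arctan2(y, x)            # horizontal angle
    pitch = np.arcsin(z / depth)      # vertical angle

    u = 0.5 * (1.0 - yaw / np.pi) * W            # column index in [0, W)
    v = (1.0 - (pitch - fov_down) / fov) * H     # row index in [0, H)
    u = np.clip(np.floor(u), 0, W - 1).astype(np.int64)
    v = np.clip(np.floor(v), 0, H - 1).astype(np.int64)

    image = np.full((H, W), -1.0, dtype=np.float32)
    image[v, u] = depth   # pixels hit by several points keep the last one written
    return image

# Example usage on synthetic points.
img = range_projection(np.random.randn(10_000, 3) * 10.0)
print(img.shape, (img >= 0).sum(), "filled pixels")
```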
“…MinkowskiNet [39] utilizes the sparse convolutions to efficiently perform semantic segmentation on the voxelized large-scale point clouds. SqueezeSeg [5] views LiDAR point clouds as range images while PolarNet [6] and Cylinder3D [11], [40], [41] divide the LiDAR point clouds under the polar and cylindrical coordinate systems.…”
Section: Related Work (mentioning)
confidence: 99%
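The asymmetrical convolutions named in the paper's title, and in the first excerpt above, replace a full 3x3x3 kernel with complementary elongated kernels along the horizontal directions. The sketch below illustrates the idea with ordinary dense torch convolutions; the exact kernel shapes, block structure, and channel counts are assumptions for illustration, and the actual networks apply such blocks with sparse 3D convolutions over occupied voxels only.

```python
import torch
import torch.nn as nn

class AsymmetricBlock(nn.Module):
    """Illustrative asymmetric 3D convolution block (dense version).

    Two branches with complementary asymmetric kernels stand in for a full
    3x3x3 convolution; their outputs are summed. Kernel shapes and the block
    layout are assumptions for illustration, not the paper's exact design.
    """
    def __init__(self, channels):
        super().__init__()
        self.branch_a = nn.Conv3d(channels, channels,
                                  kernel_size=(3, 1, 3), padding=(1, 0, 1))
        self.branch_b = nn.Conv3d(channels, channels,
                                  kernel_size=(1, 3, 3), padding=(0, 1, 1))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.branch_a(x) + self.branch_b(x))

# Example: voxel features shaped (batch, channels, rho, phi, z).
features = torch.randn(1, 16, 48, 36, 8)
out = AsymmetricBlock(16)(features)
print(out.shape)  # torch.Size([1, 16, 48, 36, 8])
```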