2019
DOI: 10.48550/arxiv.1904.10014
Preprint

Linked Dynamic Graph CNN: Learning on Point Cloud via Linking Hierarchical Features

Kuangen Zhang,
Ming Hao,
Jing Wang
et al.

Abstract: Learning on point clouds is in high demand because point clouds are a common type of geometric data and can help robots understand environments robustly. However, point clouds are sparse, unstructured, and unordered, so they cannot be recognized accurately by a traditional convolutional neural network (CNN) or a recurrent neural network (RNN). Fortunately, a graph convolutional neural network (Graph CNN) can process sparse and unordered data. Hence, we propose a linked dynamic graph CNN (LDGCNN) to clas…
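The abstract's core premise — a point cloud is an unordered set, so a neighborhood graph must be built before any graph convolution can run — can be illustrated with a minimal k-nearest-neighbor graph construction. This is a generic sketch for illustration, not the paper's implementation:

```python
import numpy as np

def knn_graph(points, k):
    """Build a k-nearest-neighbor graph from an (N, D) point cloud.

    Returns an (N, k) array: row i holds the indices of the k points
    closest to point i (excluding point i itself).
    """
    # Pairwise squared Euclidean distances, shape (N, N).
    diff = points[:, None, :] - points[None, :, :]
    dist2 = np.sum(diff ** 2, axis=-1)
    np.fill_diagonal(dist2, np.inf)  # exclude self-loops
    # Indices of the k smallest distances per row.
    return np.argsort(dist2, axis=1)[:, :k]

# Toy example: 5 random 3-D points, 2 neighbors each.
pts = np.random.default_rng(0).standard_normal((5, 3))
neighbors = knn_graph(pts, k=2)
print(neighbors.shape)  # (5, 2)
```

Because the graph is derived purely from pairwise distances, the result is invariant to the ordering of the input points — which is exactly why graph-based models suit point clouds where CNNs and RNNs do not.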


Cited by 51 publications (66 citation statements)
References 24 publications
“…These models are computationally efficient due to their excellent memory locality, but the inevitable loss of information degrades fine-grained localization accuracy [27,20]. Instead of voxelization, it is possible to develop a neural network that consumes point clouds directly [22,47,10,26,41,31,34,46,43]. Although these point-based models naturally preserve the accuracy of point locations, they are usually computationally intensive.…”
Section: Point Cloud Learning
confidence: 99%
“…
Model            Input    OA    Latency
(OA < 92.5)
PointNet [22]    16×1024  89.2  13.6 ms
PointNet++ [24]  16×1024  91.9  35.3 ms
SO-Net [13]      8×2048   90.9  −
PointGrid [11]   16×1021  92.0  −
SpiderCNN [39]   8×1024   92.4  82.6 ms
PointCNN [15]    16×1024  92.2  221.2 ms
PointWeb [48]    16×1024  92.3  −
PVCNN [20]       16×1024  92.4  24.2 ms
(OA > 92.5)
KPConv [28]      16×6500  92.9  120.5 ms
DGCNN [34]       16×1024  92.9  85.8 ms
LDGCNN [46]      16×1024  92.…

Implementation details.…”
Section: Model
confidence: 99%
“…The symbol "−" indicates that the results are not available from the references.

PointNet++ [12]      90.7%  −      −      −
SRN-PointNet++ [24]  91.5%  −      −      −
PointASNL [25]       93.2%  −      95.9%  −
Convolution-based Methods:
PointConv [26]       92.5%  −      −      −
A-CNN [27]           92.6%  90.3%  95.5%  95.3%
SFCNN [28]           92.3%  −      −      −
InterpCNN [29]       93.0%  −      −      −
ConvPoint [30]       91.8%  88.5%  −      −
Graph-based Methods:
ECC [23]             87.4%  83.2%  90.8%  90.0%
DGCNN [22]           92.2%  90.2%  −      −
LDGCNN [31]          92.9%  90.3%  −      −
Hassani et al. [32]  89.1%  −      −      −
DPAM [33]            91.9%  89.9%  94.6%  94.3%
KCNet [34]           91.0%  −      94.4%  −
ClusterNet [35]      87.1%  −      −      −
RGCNN [36]           90.5%  87.3%  −      −
LocalSpecGCN [37]    92.1%  −      −      −
PointGCN [38]        89.5%  86.1%  91.9%  91.6%
3DTI-Net [39]        91.7%  −      −      −
Grid-GCN [40]        93…”
Section: Hierarchical Prediction Architecture
confidence: 99%
“…Finally, graph-based approaches create a k-nearest-neighbor or radius graph from the input set and apply graph convolutions [65,73,87,44,64,17,9,68,21,72,88,54]. DGCNN [73] introduces the EdgeConv operator and a dynamic graph component, which reconnects the k-nearest-neighbor graph inside the network.…”
Section: Related Work
confidence: 99%
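The EdgeConv operator mentioned above computes a feature for each edge from the pair (x_i, x_j − x_i) and then max-pools over each point's neighbors. A minimal NumPy sketch of that idea follows; the linear weights, neighbor count, and feature sizes here are illustrative placeholders, not the trained DGCNN/LDGCNN model:

```python
import numpy as np

def edge_conv(points, neighbor_idx, weight):
    """Minimal EdgeConv-style layer (NumPy sketch).

    points:       (N, F)  per-point features
    neighbor_idx: (N, k)  k-NN indices per point
    weight:       (2F, C) linear map applied to [x_i, x_j - x_i]
    Returns (N, C): max over the k edge features of each point.
    """
    n, k = neighbor_idx.shape
    x_i = np.repeat(points[:, None, :], k, axis=1)       # center, (N, k, F)
    x_j = points[neighbor_idx]                           # neighbors, (N, k, F)
    edge = np.concatenate([x_i, x_j - x_i], axis=-1)     # edge input, (N, k, 2F)
    feat = np.maximum(edge @ weight, 0.0)                # linear + ReLU, (N, k, C)
    return feat.max(axis=1)                              # max-pool over neighbors

rng = np.random.default_rng(1)
pts = rng.standard_normal((6, 3))
# 3-NN indices from pairwise distances, skipping each point itself.
idx = np.argsort(((pts[:, None] - pts[None]) ** 2).sum(-1), axis=1)[:, 1:4]
out = edge_conv(pts, idx, rng.standard_normal((6, 16)))
print(out.shape)  # (6, 16)
```

The "dynamic graph" component the quote describes would simply rebuild `neighbor_idx` from the layer's output features before the next EdgeConv, so that neighborhoods are defined in feature space rather than only in the input coordinates.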