2022
DOI: 10.1007/978-3-031-20074-8_39
MPPNet: Multi-frame Feature Intertwining with Proxy Points for 3D Temporal Object Detection

Cited by 50 publications (37 citation statements) · References 35 publications
“…To resolve such limitations, 3D-MAN [31] first attempts to employ an attention mechanism to align different views of 3D objects, and then exploits a memory bank to store and aggregate multi-frame features over long sequences. Recently, Offboard3D [18] and MPPNet [2] have substantially improved detection performance: they associate the detected boxes from each frame of the sequence into proposal trajectories, and extract high-quality proposal features by sampling the sequential point clouds along those trajectories.…”
Section: Related Work
confidence: 99%
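The trajectory-based sampling described above can be sketched roughly as follows. This is an illustrative simplification, not the papers' actual implementation: boxes are treated as axis-aligned `(cx, cy, cz, l, w, h)` tuples with heading ignored, and the helper names are hypothetical.

```python
import numpy as np

def points_in_box(points, box):
    """Select points inside an axis-aligned box (cx, cy, cz, l, w, h).
    Box heading is ignored here for simplicity."""
    center, dims = box[:3], box[3:6]
    mask = np.all(np.abs(points[:, :3] - center) <= dims / 2.0, axis=1)
    return points[mask]

def crop_trajectory_points(frames, trajectory):
    """Given raw points per frame and one associated box per frame (a
    proposal trajectory), crop the points falling inside each frame's
    box, yielding a sequence of per-frame object point sets."""
    return [points_in_box(pts, box) for pts, box in zip(frames, trajectory)]
```

The per-frame crops can then be fed to any feature extractor that aggregates the sequence into a single proposal feature.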
“…Our MSF method also samples points from the sequence, but it differs from the methods based on proposal trajectories [2,18] in that we generate proposals only on the current frame and propagate them to explore features in the preceding frames. This makes our method much more efficient and better suited to online detection systems.…”
Section: Related Work
confidence: 99%
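The propagation step mentioned in this excerpt can be sketched as below. The constant-velocity motion model, the `(cx, cy, cz, l, w, h)` box layout, and the function names are assumptions for illustration; the actual method may use a different motion model or box parameterization.

```python
import numpy as np

def propagate_box(box, velocity, dt):
    """Shift a current-frame box (cx, cy, cz, l, w, h) back in time by dt
    seconds, under a (hypothetical) constant-velocity motion model."""
    moved = box.copy()
    moved[:2] -= velocity * dt  # shift the x, y center backwards in time
    return moved

def propagate_to_past(box, velocity, num_frames, frame_dt=0.1):
    """Propagate one current-frame proposal to num_frames preceding frames,
    so object points can be gathered there without re-running detection."""
    return [propagate_box(box, velocity, (i + 1) * frame_dt)
            for i in range(num_frames)]
```

Because only the current frame is processed by the detector, the preceding frames contribute features through these propagated boxes alone, which is what makes the scheme cheap enough for online use.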