2019
DOI: 10.1007/978-3-030-12939-2_23
NRST: Non-rigid Surface Tracking from Monocular Video

Abstract: We propose an efficient method for non-rigid surface tracking from monocular RGB videos. Given a video and a template mesh, our algorithm sequentially registers the template non-rigidly to each frame. We formulate the per-frame registration as an optimization problem that includes a novel texture term specifically tailored towards tracking objects with uniform texture but fine-scale structure, such as the regular microstructural patterns of fabric. Our texture term exploits the orientation information in the m…
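The abstract describes sequential, template-based registration: each frame is fit by minimizing an energy that combines an image-driven data term with regularizers. The following is a minimal toy sketch of that per-frame optimization pattern, not the paper's actual model — it assumes a simple quadratic data term over hypothetical 2D correspondences and a chain-Laplacian smoothness regularizer standing in for the texture term, plus a temporal term tying each frame's solution to the previous one.

```python
import numpy as np

def register_frame(prev_verts, targets, lam_temp=0.1, lam_smooth=0.1,
                   steps=200, lr=0.05):
    """Toy per-frame registration energy in the spirit of template-based
    non-rigid tracking (hypothetical simplification, not the paper's model).

    Minimizes, by gradient descent over 2D vertex positions V:
        E(V) = ||V - targets||^2          # stand-in for the image/texture term
             + lam_smooth * ||L V||^2     # deformation-smoothness regularizer
             + lam_temp * ||V - prev||^2  # temporal regularization
    `targets` stands in for correspondences derived from the current frame.
    """
    V = prev_verts.copy()
    n = len(V)
    # Circular chain Laplacian as a minimal smoothness operator.
    L = np.zeros((n, n))
    for i in range(n):
        L[i, i] = 2.0
        L[i, (i - 1) % n] = -1.0
        L[i, (i + 1) % n] = -1.0
    for _ in range(steps):
        grad = 2.0 * (V - targets)
        grad += 2.0 * lam_smooth * (L.T @ (L @ V))
        grad += 2.0 * lam_temp * (V - prev_verts)
        V -= lr * grad
    return V
```

Tracking a whole video under this scheme means calling `register_frame` once per frame and feeding each result back in as `prev_verts` for the next frame, which is what makes the registration sequential.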

Cited by 3 publications (1 citation statement)
References 41 publications (77 reference statements)
“…SfT methods commonly operate on individual images, and although they provide 3D correspondences with a template, the reconstructions can suffer from frame‐to‐frame jitter. Deviating from this, a few works [YRCA15, HXR*18] employ an explicit temporal regularization term and ϕ‐SfT outputs temporally smooth surfaces owing to simulation. Besides, exploring joint optimization over multiple frames is promising and tractable for SfT due to advances in GPUs.…”
Section: State-of-the-art Methods
Mentioning confidence: 99%