2004
DOI: 10.1007/978-3-540-24671-8_23
Structure and Motion from Images of Smooth Textureless Objects

Abstract: This paper addresses the problem of estimating the 3D shape of a smooth textureless solid from multiple images acquired under orthographic projection from unknown and unconstrained viewpoints. In this setting, the only reliable image features are the object's silhouettes, and the only true stereo correspondences between pairs of silhouettes are the frontier points where two viewing rays intersect in the tangent plane of the surface. An algorithm for identifying geometrically-consistent frontier points…
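The frontier-point geometry in the abstract can be illustrated with a small sketch. Under orthographic projection, the epipolar lines of an image pair are parallel, so the two outermost epipolar tangency points of a silhouette are simply the contour points extremal along the normal to the epipolar direction. The following minimal NumPy sketch is illustrative only; the function name and the square example are our own assumptions, not part of the paper:

```python
import numpy as np

def outermost_epipolar_tangencies(contour, epipolar_dir):
    """Return the two outermost epipolar tangency points of a silhouette.

    Under orthographic projection all epipolar lines in an image are
    parallel, so the outermost tangency points are the contour points
    extremal along the normal to the epipolar direction.

    contour      -- (N, 2) array of 2D silhouette boundary points
    epipolar_dir -- (2,) vector giving the epipolar line direction
    """
    d = np.asarray(epipolar_dir, dtype=float)
    d /= np.linalg.norm(d)
    n = np.array([-d[1], d[0]])   # normal to the epipolar lines
    proj = contour @ n            # signed distance along the normal
    return contour[np.argmin(proj)], contour[np.argmax(proj)]

# Example: a square silhouette with horizontal epipolar lines;
# the tangency points lie on the bottom and top edges.
square = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.0, 1.0]])
bottom, top = outermost_epipolar_tangencies(square, [1.0, 0.0])
```

For a convex silhouette these two points are the only epipolar tangencies; non-convex silhouettes can have additional ones, which is the matching difficulty several of the citing papers discuss below.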

Cited by 24 publications (19 citation statements)
References 23 publications
“…Although these methods have given good results, their main drawback is the limited number of epipolar tangency points per pair of images, generally only two: one at the top and one at the bottom of the silhouette. When additional epipolar tangency points are available, the goal is to match them across different views and handle their visibility, as proposed in [15] and [19]. An additional limitation of all these methods is their inability to cope with partial or truncated silhouettes, as in the examples shown in Fig.…”
Section: Previous Work
confidence: 99%
“…Silhouettes have already been used for camera motion estimation using the notion of epipolar tangency points [7], [8], [13], i.e., points on the silhouette contours in which the tangent to the silhouette is an epipolar line. A rich literature exists on exploiting epipolar tangents, both for orthographic cameras [7], [9], [14], [15] and perspective cameras [16], [17], [18], [19]. In particular, the works of [17] and [18] use only the two outermost epipolar tangents, which eliminates the need for matching corresponding epipolar tangents across different images.…”
Section: Previous Work
confidence: 99%
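The tangency condition quoted above, a contour point at which the tangent to the silhouette is an epipolar line, can be tested directly in the orthographic case: with parallel epipolar lines, the tangencies are the local extrema of the contour's signed distance along the normal to the epipolar direction, which generally yields more than the two outermost points on non-convex silhouettes. A hedged sketch, with function name and circle sampling chosen by us rather than taken from the cited algorithms:

```python
import numpy as np

def epipolar_tangency_indices(contour, epipolar_dir):
    """Indices of all contour points where the boundary tangent is
    parallel to the (parallel, orthographic) epipolar lines.

    Detected as sign changes of the finite-difference derivative of
    the contour's signed distance along the epipolar-line normal.
    """
    d = np.asarray(epipolar_dir, dtype=float)
    d /= np.linalg.norm(d)
    n = np.array([-d[1], d[0]])          # normal to the epipolar lines
    proj = contour @ n                   # signed distance along the normal
    deriv = np.roll(proj, -1) - proj     # derivative along the closed contour
    # a tangency sits where the derivative changes sign (extremum of proj)
    sign_change = np.sign(deriv) * np.sign(np.roll(deriv, 1)) < 0
    return np.nonzero(sign_change)[0]

# Example: a circle sampled at 8 points, horizontal epipolar lines;
# only the top and bottom points are tangencies.
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
circle = np.column_stack([np.cos(angles), np.sin(angles)])
idx = epipolar_tangency_indices(circle, [1.0, 0.0])
```

On a convex contour this recovers exactly the two outermost tangencies; on a contour with concavities it returns the extra candidates that methods such as [15] and [19] must then match and check for visibility across views.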
“…They satisfy therefore what is called the generalized epipolar constraint [3]. They allow hereby projective reconstruction when localized in images [5,6]. The connection between the generalized epipolar constraint and the pairwise tangency constraint (3) is that the latter implies the former at particular frontier points.…”
Section: Connection With Frontier Points
confidence: 99%
“…in [4] inflexions of the silhouette boundary are used to detect frontier points from which motion is derived, in [5] infinite 4D spaces are explored using random samples and in [6] contour signatures are used to find potential frontier points. All these approaches require frontier points to be identified on the silhouette contours prior to camera parameter estimation.…”
Section: Introduction
confidence: 99%
“…Unlike other algorithms, e.g. [6], who search for all possible frontier points and epipolar tangents on a single silhouette, we only search for the outermost frontier points and epipolar tangents, but for many silhouettes. Only using the outermost epipolar tangents allows us to be far more efficient because the data structures are simpler and there are no self-occlusions.…”
Section: Background and Previous Work
confidence: 99%