2007
DOI: 10.1093/ietisy/e90-d.2.579
High Accuracy Fundamental Matrix Computation and Its Performance Evaluation

Abstract: We compare the convergence performance of different numerical schemes for computing the fundamental matrix from point correspondences over two images. First, we state the problem and the associated KCR lower bound. Then, we describe the algorithms of three well-known methods: FNS, HEIV, and renormalization, to which we add Gauss-Newton iterations. For initial values, we test random choice, least squares, and Taubin's method. Experiments using simulated and real images reveal different characteristics of each method…
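As a rough sketch of the least-squares initialization mentioned in the abstract (my own illustration under the usual formulation, not code from the paper), the linear estimate of F can be obtained by total least squares on the epipolar constraint, followed by a rank-2 projection:

# A minimal sketch (assumed, not the paper's implementation) of the linear
# least-squares initialization: total least squares on the epipolar
# constraint x2^T F x1 = 0, followed by a rank-2 projection.
import numpy as np

def linear_lsq_fundamental(x1, x2):
    """x1, x2: (N, 2) arrays of corresponding points; returns a rank-2 F."""
    N = x1.shape[0]
    A = np.empty((N, 9))
    for i, ((u1, v1), (u2, v2)) in enumerate(zip(x1, x2)):
        # One row of the design matrix acting on vec(F), row-major.
        A[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
    # The right singular vector of the smallest singular value minimizes ||A f||.
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce det(F) = 0 by zeroing the smallest singular value.
    U, s, Vt = np.linalg.svd(F)
    s[2] = 0.0
    F = U @ np.diag(s) @ Vt
    return F / np.linalg.norm(F)

Taubin's method, which the paper also tests as a starting point, replaces the plain SVD step with a generalized eigenvalue problem that partially compensates for noise; the random and least-squares starts serve as the other baselines.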

Cited by 15 publications (17 citation statements)
References 11 publications (7 reference statements)
“…In contrast, other problems in geometric computer vision, such as Simultaneous Pose and Correspondence [24,28,31], fundamental matrix computation [2,14], and ellipse fitting [2,13,15,17], do take into account specific models of uncertainty per observed point. In most of these approaches, the uncertainty is modeled by a covariance matrix, and maximum-likelihood strategies are proposed to minimize the Mahalanobis distances between the noisy and the true locations of the point observations.…”
Section: Related Work
confidence: 99%
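As a hedged illustration of the covariance-weighted distance mentioned above (names and numbers are my own, not from the cited works):

# Squared Mahalanobis distance between a noisy observation and its
# hypothesised true location, weighted by that point's covariance.
import numpy as np

def mahalanobis_sq(x_obs, x_true, cov):
    d = np.asarray(x_obs, float) - np.asarray(x_true, float)
    return float(d @ np.linalg.solve(cov, d))

# An anisotropic covariance down-weights the direction in which the
# detector is assumed to be noisy (illustrative values only).
cov = np.array([[4.0, 0.0],
                [0.0, 0.25]])
print(mahalanobis_sq([1.0, 1.0], [0.0, 0.0], cov))  # 1/4 + 1/0.25 = 4.25

A maximum-likelihood fit then minimizes the sum of such terms over all correspondences, subject to the geometric constraint being estimated.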
“…As discussed in [3], estimating the global minimum for this kind of problem is impractical. A feasible alternative is to minimize approximated Sampson error functions, for instance by means of iterative approaches such as the Fundamental Numerical Scheme (FNS) [2], the Heteroscedastic Errors-in-Variables (HEIV) method [17], or projective Gauss-Newton [14]. These minimization approaches can be regarded as a solution refinement, and they need to be fed with an initial solution.…”
Section: Related Work
confidence: 99%
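For concreteness, here is a small sketch of the Sampson error for the epipolar constraint that such iterative refinements reduce; it follows the usual first-order definition and is not taken from [2], [14], or [17]:

# Per-correspondence Sampson error: a first-order approximation of the
# geometric error for the constraint x2^T F x1 = 0.
import numpy as np

def sampson_error(F, x1, x2):
    """F: 3x3 fundamental matrix; x1, x2: (N, 2) corresponding points."""
    N = x1.shape[0]
    X1 = np.hstack([x1, np.ones((N, 1))])   # homogeneous coordinates
    X2 = np.hstack([x2, np.ones((N, 1))])
    Fx1 = X1 @ F.T      # row i is F @ X1[i]
    Ftx2 = X2 @ F       # row i is F.T @ X2[i]
    num = np.einsum('ij,ij->i', X2, Fx1) ** 2                 # (x2^T F x1)^2
    den = Fx1[:, 0]**2 + Fx1[:, 1]**2 + Ftx2[:, 0]**2 + Ftx2[:, 1]**2
    return num / den

FNS, HEIV, and projective Gauss-Newton then iteratively update F to reduce the sum of these terms, starting from an initial solution such as the linear estimate sketched earlier.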
“…(7). Indeed, substituting f = f_k into the expanded update equation (12) reveals that f_k satisfies the equation set in (9). This equation set defines the necessary conditions for the critical points of the optimization problem of Eq.…”
Section: Derivation
confidence: 99%
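Equations (7), (9), and (12) belong to the citing paper and are not reproduced here; for orientation, this kind of fixed-point argument usually reduces to the standard FNS stationarity condition (notation assumed): with carriers A_i and propagated covariances B_i, a critical point θ of the approximated ML cost satisfies

\[
  X_{\theta}\,\theta = 0,
  \qquad
  X_{\theta}
  = \sum_i \frac{A_i}{\theta^{\top} B_i \theta}
  - \sum_i \frac{\theta^{\top} A_i \theta}{\bigl(\theta^{\top} B_i \theta\bigr)^{2}}\, B_i ,
\]

so an iterate that the update rule maps to itself automatically satisfies the necessary conditions for a critical point.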
“…Any such Λ^0_{x_i} is meant to carry the bulk of information about the relative importance of the individual entries of x_i (see [6], [7] for the raw covariance matrices of homography estimates and [8], [9] and Section VI for the raw covariance matrices of fundamental matrix estimates). Upon upgrading every Λ^0_{x_i} to a corresponding corrected covariance matrix…”
Section: B. AML Cost Function
confidence: 99%
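For readers without the citing paper, the AML (approximated maximum likelihood) cost that this section heading refers to is conventionally written as follows (my paraphrase of the standard FNS/AML formulation; ξ_i is the carrier built from the observation x_i, and ∂_x ξ_i is its Jacobian):

\[
  J_{\mathrm{AML}}(\theta)
  = \sum_i
    \frac{\theta^{\top} \xi_i \xi_i^{\top} \theta}
         {\theta^{\top} \bigl(\partial_x \xi_i \,\Lambda^{0}_{x_i}\, \partial_x \xi_i^{\top}\bigr)\, \theta},
\]

so A_i = ξ_i ξ_i^T and B_i is the propagated covariance in the stationarity condition above. Replacing each Λ^0_{x_i} by a corrected covariance matrix, as the excerpt describes, changes only the denominator weights B_i; the form of the cost and of its stationarity condition is unchanged.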