2008 IEEE Conference on Robotics, Automation and Mechatronics 2008
DOI: 10.1109/ramech.2008.4681521
Uncalibrated Visual Servoing Using More Precise Model

Cited by 7 publications (3 citation statements)
References 8 publications
“…To improve the computational efficiency, some methods do not directly approximate the Hessian matrix, only the residual term. Fu et al. (2008) used a secant method to approximate the residual term and used the Levenberg–Marquardt algorithm (LMA) to ensure global convergence. Kim and Lee (2006) used the full Newton’s method and the secant method to approximate the residual.…”
Section: Related Work
confidence: 99%
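For context, the residual term these secant methods approximate comes from the exact Hessian of the least-squares objective $f(x) = \tfrac{1}{2}\lVert r(x)\rVert^2$. In standard nonlinear least-squares notation (not reproduced from the paper itself):

$$
\nabla^2 f(x) = J(x)^\top J(x) + \sum_i r_i(x)\,\nabla^2 r_i(x)
$$

Gauss–Newton drops the second (residual) sum entirely; the methods cited here instead approximate that sum with secant updates, retaining second-order information for large-residual problems without computing exact second derivatives.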
“…Fu et al. (2008) suggested using a trust-region method to adjust the damping parameter μ_k self-adaptively: μ_k is adjusted according to the actual reduction of the objective function and the predicted reduction of the quadratic model.…”
Section: Approach
confidence: 99%
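The adjustment described in that statement can be sketched as a generic trust-region-style damping update for Levenberg–Marquardt. This is a minimal illustration, not the paper's implementation; the thresholds (0.25/0.75) and scaling factors are common textbook choices, assumed here:

```python
import numpy as np

def lm_step(J, r, mu):
    """One Levenberg-Marquardt step: solve (J^T J + mu*I) d = -J^T r."""
    n = J.shape[1]
    A = J.T @ J + mu * np.eye(n)
    g = J.T @ r
    return np.linalg.solve(A, -g)

def update_damping(f_old, f_new, d, J, r, mu):
    """Adjust mu from the ratio of actual to predicted reduction.

    The quadratic model is m(d) = 0.5*||r + J d||^2, so the predicted
    reduction is m(0) - m(d) = -d.(J^T r) - 0.5 * d.(J^T J) d.
    """
    pred = -(d @ (J.T @ r)) - 0.5 * d @ (J.T @ J) @ d
    actual = f_old - f_new
    rho = actual / pred if pred > 0 else -1.0
    if rho > 0.75:          # model agrees well: reduce damping (larger steps)
        mu = max(mu / 3.0, 1e-12)
    elif rho < 0.25:        # model agrees poorly: increase damping
        mu = mu * 2.0
    return mu, rho
```

With a good step (rho near 1), mu shrinks and the iteration behaves like Gauss–Newton; with a poor step, mu grows and the iteration falls back toward damped gradient descent.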
“…To do away with this dependence, one could optimize the parameters of the image Jacobian while the error in the image plane is being minimized. This is done, for instance, by using Gauss–Newton to minimize the squared image error and nonlinear least-squares optimization for the image Jacobian [7]; by using weighted recursive least squares, not to obtain the true parameters, but an approximation that still guarantees asymptotic stability of the control law in the sense of Lyapunov [8], [9]; by using k-nearest-neighbor regression to store previously estimated local models or previous movements and estimating the Jacobian with local least squares [10]; or by building a secant model from a population of the previous iterates [11]. To provide robustness to outliers in the computation of the Jacobian, [12] proposes the use of an M-estimator.…”
Section: Introduction
confidence: 99%
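The secant-model idea mentioned above is closely related to the classic Broyden rank-one update commonly used to estimate the image Jacobian online in uncalibrated visual servoing. A minimal sketch (the function name and damping factor `lam` are my own illustrative choices, not taken from the cited works):

```python
import numpy as np

def broyden_update(J, dq, de, lam=1.0):
    """Rank-one secant update of the image Jacobian estimate J.

    After the update, J satisfies the secant condition J @ dq = de
    (exactly when lam = 1), where dq is the joint-space displacement
    and de the observed change in image features.
    """
    denom = dq @ dq
    if denom < 1e-12:       # skip degenerate (near-zero) motions
        return J
    return J + lam * np.outer(de - J @ dq, dq) / denom
```

Setting lam below 1 damps the update, trading tracking speed for robustness to measurement noise in the observed feature change.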