2010
DOI: 10.1016/j.laa.2009.04.017

Least squares problems with inequality constraints as quadratic constraints

Abstract: Linear least squares problems with box constraints are commonly solved to find model parameters within bounds based on physical considerations. Common algorithms include Bounded Variable Least Squares (BVLS) and the Matlab function lsqlin. Here, the goal is to find solutions to ill-posed inverse problems that lie within box constraints. To do this, we formulate the box constraints as quadratic constraints, and solve the corresponding unconstrained regularized least squares problem. Using box constraints as qua…
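The idea sketched in the abstract — trading a box constraint for a quadratic constraint centred in the box, then solving an unconstrained regularized least squares problem — can be illustrated with a minimal NumPy sketch. Everything here (the function name, the centred penalty ‖x − c‖², the simple λ sweep) is an illustrative simplification, not the paper's actual algorithm:

```python
import numpy as np

def box_ls_via_quadratic(A, b, lo, hi, lams=np.logspace(-4, 4, 50)):
    # Illustrative sketch: replace the box lo <= x <= hi by the quadratic
    # constraint ||x - c||^2 <= r^2 centred at c = (lo + hi)/2, then solve
    # the regularized problem
    #     min ||A x - b||^2 + lam ||x - c||^2
    # for increasing lam until the minimizer falls inside the box.
    c = (lo + hi) / 2.0
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    for lam in lams:
        # Normal equations of the regularized problem.
        x = np.linalg.solve(AtA + lam * np.eye(n), Atb + lam * c)
        if np.all(x >= lo - 1e-12) and np.all(x <= hi + 1e-12):
            return x, lam
    return c, lams[-1]  # fall back to the box centre

# Toy usage: true parameters lie inside the unit box.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = A @ np.array([0.2, 0.5, 0.8]) + 0.01 * rng.standard_normal(20)
x, lam = box_ls_via_quadratic(A, b, np.zeros(3), np.ones(3))
```

For well-conditioned data whose unconstrained solution already sits in the box, the smallest λ in the sweep is accepted; larger λ values pull the solution toward the box centre.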

Cited by 29 publications (28 citation statements)
References 16 publications
“…The linear regression model calculates the coefficients X_s^k of each metric k for a particular segment s, together with an intercept coefficient C. The coefficients are found by solving a least squares problem, minimising the sum of squared errors (min ‖Ax − b‖₂²), where in our case A is the matrix of our metric observations and b holds the corresponding time metrics (Mead and Renaut 2010; Lawson and Hanson 1974). We experimented with six different approaches using this model (the name in brackets gives the id of each approach):…”
Section: Performance Metrics Weights
Mentioning confidence: 99%
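The regression step described in the excerpt above — minimising ‖Ax − b‖₂² with an added intercept C — can be sketched as follows; the toy data, shapes, and variable names are invented for illustration:

```python
import numpy as np

# Illustrative sketch of the cited regression step: fit per-metric
# coefficients plus an intercept C by minimising ||A x - b||_2^2.
obs = np.array([[1.0, 2.0],
                [2.0, 1.0],
                [3.0, 4.0],
                [4.0, 3.0]])              # metric observations (rows = samples)
times = np.array([5.0, 6.0, 11.0, 12.0])  # corresponding time metric b

# Append a column of ones so the intercept C is estimated jointly.
A = np.hstack([obs, np.ones((obs.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, times, rcond=None)
x_metrics, intercept = coef[:-1], coef[-1]
```

Here the toy data were generated exactly by weights (2, 1) and intercept 1, so the least squares fit recovers them.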
“…This is an extension of the scalar χ² method [21,22,23,31], which can be viewed as a regularization method. The new method amounts to solving multiple χ² tests, yielding as many equations as there are unknowns in the diagonal weighting matrix for the data or parameter misfits.…”
Section: Discussion
Mentioning confidence: 99%
“…This statistical interpretation of the weights in (3) and the multipliers in (8) gives us a method for calculating them, which we term the χ² method for parameter estimation and uncertainty quantification [21,22,23,31].…”
Section: Regularization and Constrained Optimization
Mentioning confidence: 99%
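As a rough illustration of the scalar χ² principle these excerpts refer to, one can choose the regularization weight so that the optimal objective value matches its expected χ² value (the number of data points, for identity covariances). This is a hedged sketch under simplifying assumptions — scalar weight, identity weighting matrices, bisection in log-space — not the authors' exact method:

```python
import numpy as np

def chi2_lambda(A, b, sigma=1.0):
    # Illustrative chi^2 principle: pick lam so that the minimum of
    #     J(lam) = ||A x - b||^2 / sigma^2 + lam ||x||^2
    # equals its expected chi^2 value, here the number of data m.
    # J(lam) is increasing in lam, so log-space bisection suffices.
    m, n = A.shape
    AtA, Atb = A.T @ A, A.T @ b

    def J(lam):
        x = np.linalg.solve(AtA / sigma**2 + lam * np.eye(n), Atb / sigma**2)
        return np.sum((A @ x - b) ** 2) / sigma**2 + lam * np.sum(x**2)

    lo_, hi_ = 1e-8, 1e8
    for _ in range(200):
        mid = np.sqrt(lo_ * hi_)
        if J(mid) < m:
            lo_ = mid
        else:
            hi_ = mid
    return mid

# Toy usage: Gaussian noise of known standard deviation 0.1.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))
b = A @ np.array([1.0, -0.5, 0.3]) + 0.1 * rng.standard_normal(50)
lam = chi2_lambda(A, b, sigma=0.1)
```

The bisection works because the unregularized residual gives J below m while very strong regularization drives J toward ‖b‖²/σ², well above m.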
“…For the solution of the problem, the generalized least squares method is applied to the estimation of parameters. In [14], the problem of estimating plant parameters using the least squares method with quadratic constraints is studied. Physical knowledge and a regularization method are used to construct the constraints.…”
Section: Introduction
Mentioning confidence: 99%