2022
DOI: 10.1002/mma.8434
Generalized and optimal sequence of weights on a progressive‐iterative approximation method with memory for least square fitting

Abstract: The generalized and optimal sequence of weights on a progressive-iterative approximation method with memory for least-squares fitting (GOLSPIA) improves on the MLSPIA method by extending it to multidimensional data fitting. In addition, the weights of the moving average vary between iterations, using three optimal sequences of weights derived from the singular values of the collocation matrix. It is proved that, with an appropriate choice of weights, the sequence of fitted results converges to the solution of the least-squares fitting problem…
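The abstract describes an LSPIA-type iteration whose moving-average weights change between iterations and are tied to the singular values of the collocation matrix. The paper's exact GOLSPIA update and its optimal weight sequences are not reproduced here, so the sketch below is only a generic least-squares progressive-iterative step with a single fixed memory weight; the function name, the zero initial guess, and the 1/sigma_max^2 step size are illustrative assumptions, not the authors' formulas.

```python
import numpy as np

def lspia_with_memory(A, Q, gamma=None, omega=0.0, iters=500, tol=1e-10):
    # A: (m, n) collocation matrix (basis functions evaluated at the data parameters)
    # Q: (m, d) data points, one row per d-dimensional point
    # gamma: step weight on the residual term
    # omega: weight on the previous step, i.e. the "memory" of the iteration
    m, n = A.shape
    if gamma is None:
        # Illustrative default: 1 / sigma_max(A)^2 makes the memoryless
        # (omega = 0) iteration a convergent Richardson step for A^T A.
        sigma_max = np.linalg.svd(A, compute_uv=False)[0]
        gamma = 1.0 / sigma_max ** 2

    P = np.zeros((n, Q.shape[1]))     # control points, zero start for illustration
    P_prev = P.copy()
    for _ in range(iters):
        R = Q - A @ P                 # residuals at the data points
        P_next = P + omega * (P - P_prev) + gamma * (A.T @ R)
        if np.linalg.norm(P_next - P) < tol:
            P = P_next
            break
        P_prev, P = P, P_next
    return P

# Example: fit a cubic polynomial (Vandermonde collocation) to 2-D samples.
t = np.linspace(0.0, 1.0, 50)
A = np.vander(t, 4, increasing=True)          # 50 x 4 collocation matrix
Q = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
P = lspia_with_memory(A, Q, omega=0.3)
```

With omega = 0 this reduces to the classical LSPIA residual step; a positive omega adds a heavy-ball style memory term, which is only a stand-in for the paper's iteration-dependent weight sequences.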

Cited by 1 publication (1 citation statement). References 28 publications.
“…In GLS, factors are derived by minimizing the weighted sum of squared discrepancies between the observed and estimated correlation matrices. This method is more efficient than the ULS under multivariate normality and is used when specific weighting of discrepancies is desired (31).…”
Section: Generalized Least Squares (GLS). Citation type: mentioning (confidence: 99%).
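The quoted statement describes GLS factor extraction as minimizing a weighted sum of squared discrepancies between the observed and model-implied correlation matrices. As a minimal illustration of that objective, the sketch below evaluates the commonly used GLS discrepancy function F_GLS = 0.5 * tr[(I - R^{-1} Sigma)^2]; the function name and interface are assumptions, not part of the cited text.

```python
import numpy as np

def gls_discrepancy(R, Sigma):
    # R: observed correlation matrix, Sigma: model-implied correlation matrix.
    # The observed matrix supplies the weights, so discrepancies are weighted
    # by R^{-1} rather than treated equally as in unweighted least squares (ULS).
    R_inv = np.linalg.inv(R)
    D = np.eye(R.shape[0]) - R_inv @ Sigma
    return 0.5 * np.trace(D @ D)
```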