2021
DOI: 10.1016/j.jprocont.2021.01.004
A robust modifier adaptation method via Hessian augmentation using model uncertainties

Cited by 4 publications (6 citation statements) | References 30 publications
“…The drawback is the need to estimate second-order information of the plant based on plant measurements. More recently, [12] used an aggregate model, which combined several models of the plant. (Decision variables are referred to, in this paper, as "controls".)…”
Section: Modifier Adaptation (MA)
confidence: 99%
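For context on the scheme this excerpt discusses, below is a minimal sketch of a first-order modifier adaptation (MA) loop; the function names, filter gain, and unconstrained setting are illustrative assumptions, not the cited paper's formulation.

    # Minimal first-order MA sketch (unconstrained case, hypothetical names).
    import numpy as np
    from scipy.optimize import minimize

    def run_ma(model_cost, model_grad, plant_grad, u0, K=0.5, iters=20):
        u = np.asarray(u0, dtype=float)
        lam = np.zeros_like(u)
        for _ in range(iters):
            # Filtered first-order modifier: plant/model gradient mismatch at u_k.
            lam = (1.0 - K) * lam + K * (plant_grad(u) - model_grad(u))
            # Optimize the gradient-modified model cost  phi(v) + lam^T (v - u_k).
            u = minimize(lambda v: model_cost(v) + lam @ (v - u), u).x
        return u

The second-order plant information mentioned in the excerpt would enter as an additional Hessian-type modifier, which is what aggregate-model and Hessian-augmentation approaches aim to avoid estimating directly from noisy measurements.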
“…Since the plant optimum is unknown beforehand, this could only be guaranteed a priori by enforcing that all models are convex over the whole feasible input space or by convexifying them (see, e.g., [12]). Despite no proof of guaranteed convergence, the case studies in the following section illustrate the benefits of our algorithms when an appropriate model set is chosen.…”
Section: Remark on Formal Guarantees and Model Convergence
confidence: 99%
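One generic way to convexify a candidate model, as mentioned in the excerpt, is to project its Hessian onto the positive semidefinite cone; the snippet below is an illustrative sketch of that device only, not necessarily the construction used in [12] or in the cited paper.

    # Illustrative convexification: clip negative Hessian eigenvalues (hypothetical helper).
    import numpy as np

    def convexify_hessian(H, eps=1e-8):
        w, V = np.linalg.eigh(H)                        # eigendecomposition of the symmetric Hessian
        return V @ np.diag(np.maximum(w, eps)) @ V.T    # force the smallest eigenvalue up to eps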
“…Using this assumption of parametrically bounded second derivatives, an approach known as directional Hessian modifier adaptation (DHMA) was developed. The second directional derivative of the modified model constraint is set equal to the maximum second directional derivative over the set of models, i.e., the upper bound on the second directional derivative with respect to θ.…”
Section: Feasible-Side Convergence Under Structural Uncertainty
confidence: 99%
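The curvature-matching idea in this excerpt can be pictured with a small helper that, for a given direction, returns the largest second directional derivative over a set of candidate model Hessians; the names and the way the model set is represented are assumptions for illustration only.

    # Worst-case second directional derivative r^T H r over a set of model Hessians.
    import numpy as np

    def worst_case_directional_curvature(hessians, r):
        r = np.asarray(r, dtype=float)
        return max(float(r @ H @ r) for H in hessians)

In DHMA-style schemes, a bound of this kind augments the modified constraint so that its curvature along the chosen direction is at least as conservative as that of any model in the set.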
“…Proposition 2 (WCMA-KKT matching upon convergence). Consider the WCMA problem, given by (31), with no measurement noise and perfect estimates of the gradients, using either a constant filter or the new filter given by (26); let u_p* be a KKT point of the plant [given by (1)], and assume that (31) has converged to the fixed point u_∞ = lim_{k→∞} u_k. Then, at u_p*, the KKT conditions of (31) match those of the plant.…”
Section: Robust MA Approaches
confidence: 99%
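In generic MA notation (illustrative symbols, not necessarily the citing paper's), the matching property rests on the modifiers forcing the modified model and the plant to agree in constraint value and in gradients at the converged input:

    \[
    \nabla\Phi_m(u_\infty)=\nabla\Phi_p(u_\infty),\qquad
    G_{m,i}(u_\infty)=G_{p,i}(u_\infty),\qquad
    \nabla G_{m,i}(u_\infty)=\nabla G_{p,i}(u_\infty),
    \]

so the stationarity and complementary-slackness conditions written for the modified problem at u_∞ are the same equations as those written for the plant.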