2019
DOI: 10.1016/j.ifacol.2019.06.107
Sequential ℓ1 Quadratic Programming for Nonlinear Model Predictive Control

Cited by 8 publications
(8 citation statements)
References 16 publications
“…Secondly, the absolute value function is not differentiable, which means that classical gradient-based optimisation methods cannot be used. Despite the second of these computational difficulties, it is possible to find in the literature applications of gradient-based nonlinear optimisation methods to solve the MPC-L optimisation task ( 5 ) [ 23 , 24 , 28 ]. The objective of this work is to derive a computationally much simpler approach to the MPC-L problem in which nonlinear optimisation is not used.…”
Section: Computationally Efficient Nonlinear Mpc Using the L Cost-functionmentioning
confidence: 99%
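The nonsmoothness discussed in the excerpt above is commonly handled by a slack-variable reformulation. The sketch below is an illustration of that standard trick, not the specific method of [23, 24, 28]: minimising the ℓ1 residual ||Ax − b||₁ is rewritten as a smooth linear program that an off-the-shelf solver such as SciPy's `linprog` can handle directly.

```python
import numpy as np
from scipy.optimize import linprog

# Standard slack-variable trick: min_x ||A x - b||_1 is nonsmooth, but
# introducing slacks s with  -s <= A x - b <= s  gives the smooth LP
#   min sum(s)  s.t.  A x - s <= b,  -A x - s <= -b,  s >= 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))
b = rng.standard_normal(6)
m, n = A.shape

# Decision vector z = [x (n free), s (m nonnegative)].
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
b_ub = np.concatenate([b, -b])
bounds = [(None, None)] * n + [(0, None)] * m  # x free, s >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x_opt = res.x[:n]
print(res.status, np.abs(A @ x_opt - b).sum())
```

At the optimum each slack is tight, so `res.fun` equals the ℓ1 residual; the same construction underlies ℓ1 penalty terms inside SQP subproblems.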
“…Examples of nonlinear MPC-L algorithms that use Sequential Quadratic Programming are described in [ 23 , 24 ]. A trust-region sequential quadratic programming method is used in [ 28 ].…”
Section: Introductionmentioning
confidence: 99%
“…Remark 1: The optimization problem (3) is numerically hard to tackle, since u is a continuous-time signal and thus the number of decision variables is infinite. The direct solution of the OCP requires a finite parametrization of the input signal u (see, e.g., [24]). For example, as illustrated in Section V, a piecewise-constant parametrization can be assumed, with changes of value at the nodes τ1, …”
Section: Nmpc Frameworkmentioning
confidence: 99%
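The piecewise-constant parametrization mentioned above can be sketched as follows. All names here are illustrative, not taken from the cited work: the continuous-time input u(t) is replaced by N values u₀, …, u_{N−1}, each held constant between consecutive nodes, so the OCP has finitely many decision variables.

```python
import numpy as np

def simulate(f, x0, u_nodes, t_grid):
    """Forward-Euler rollout with the input held constant on each
    interval [t_grid[k], t_grid[k+1]] (one decision value per interval)."""
    x = np.asarray(x0, dtype=float)
    traj = [x.copy()]
    for k in range(len(t_grid) - 1):
        dt = t_grid[k + 1] - t_grid[k]
        x = x + dt * f(x, u_nodes[k])
        traj.append(x.copy())
    return np.array(traj)

# Toy scalar system dx/dt = -x + u over a horizon split into N = 4
# intervals; the infinite-dimensional u(t) collapses to 4 numbers.
f = lambda x, u: -x + u
t_grid = np.linspace(0.0, 2.0, 5)          # nodes tau_0 .. tau_4
u_nodes = np.array([1.0, 0.5, 0.0, -0.5])  # one value per interval
traj = simulate(f, np.array([0.0]), u_nodes, t_grid)
print(traj.shape)  # (5, 1)
```

A direct NMPC method would wrap an optimizer around `u_nodes`; the PMP-based approach discussed next avoids this parametrization step entirely.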
“…To sum up, the proposed NMPC framework shows the following advantages: i) in contrast to numerical methods that require a discretization of the state, input and constraints before optimization [24], the PMP-based solution no longer needs the input parametrization, resulting in better accuracy in tracking the reference; ii) the PMP-based NMPC appears to achieve a more efficient trade-off between computational complexity and final reference tracking than the direct methods, making it suitable for on-line applications.…”
mentioning
confidence: 99%
“…The kernel functions (K) used in the SVR for this paper are linear (19), radial basis function (RBF) (18), and polynomial (20). For an overall evaluation, the L1 quadratic programming (L1QP) [84], iterative single data algorithm (ISDA) [85], and sequential minimal optimization (SMO) [86] optimizers were used. For the initial comparison of ensemble models, the linear kernel function and the L1QP optimizer were adopted.…”
Section: Ensemble Learning Modelsmentioning
confidence: 99%
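The three kernels named in the excerpt above can be sketched directly; the hyperparameters `gamma`, `degree`, and `coef0` below are illustrative defaults, not values from the paper.

```python
import numpy as np

# Common SVR kernel functions K(x, z); hyperparameters are illustrative.
def linear_kernel(x, z):
    return x @ z

def rbf_kernel(x, z, gamma=0.5):
    # Radial basis function: exp(-gamma * ||x - z||^2)
    return np.exp(-gamma * np.sum((x - z) ** 2))

def poly_kernel(x, z, degree=3, coef0=1.0):
    return (x @ z + coef0) ** degree

x = np.array([1.0, 2.0])
z = np.array([0.5, -1.0])
print(linear_kernel(x, z), rbf_kernel(x, z), poly_kernel(x, z))
```

Each kernel induces a different feature space for the SVR; the L1QP, ISDA, and SMO labels refer to alternative solvers for the resulting (kernelized) quadratic program, not to different kernels.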