In this article, an algorithm that guarantees asymptotic stability for parametric model order reduction by matrix interpolation is proposed for the general class of high-dimensional linear time-invariant systems. In the first step, the system matrices of the high-dimensional parameter-dependent system are computed for a set of parameter vectors. The local high-order systems are reduced by a projection-based reduction method and stabilized, if necessary. In the second step, the low-order systems are transformed into a consistent set of generalized coordinates. In the third step, a new procedure based on semidefinite programming is applied to the low-order systems, converting them into strictly dissipative form. Finally, an asymptotically stable reduced-order model can be computed for any new parameter vector of interest by interpolating the system matrices of the local low-order models. We show that this approach imposes no limiting conditions on the structure of the large-scale model and is suitable for real-time applications. The method is illustrated by two numerical examples.
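To make the final interpolation step concrete, the following minimal Python sketch shows a weighted combination of local reduced-order matrices. It assumes (as the abstract states) that the local models have already been brought into consistent generalized coordinates and into strictly dissipative form, i.e. each $E_i$ is symmetric positive definite and $A_i + A_i^{\mathsf T}$ is negative definite, so that any convex combination inherits these properties and is therefore asymptotically stable. The function name and interface are illustrative, not the authors' implementation.

```python
import numpy as np

def interpolate_rom(E_list, A_list, B_list, C_list, weights):
    """Convex combination of local reduced-order models (illustrative sketch).

    Assumption: each local model (E_i, A_i, B_i, C_i) is in consistent
    generalized coordinates and strictly dissipative form (E_i > 0,
    A_i + A_i^T < 0), so the interpolated model again satisfies E > 0 and
    A + A^T < 0 and is hence asymptotically stable.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize to a convex combination (w_i >= 0, sum = 1)
    E = sum(wi * Ei for wi, Ei in zip(w, E_list))
    A = sum(wi * Ai for wi, Ai in zip(w, A_list))
    B = sum(wi * Bi for wi, Bi in zip(w, B_list))
    C = sum(wi * Ci for wi, Ci in zip(w, C_list))
    return E, A, B, C

# Example check: eigenvalues of E^{-1} A have negative real parts,
# confirming asymptotic stability of the interpolated model.
# eigvals = np.linalg.eigvals(np.linalg.solve(E, A))
# assert np.all(eigvals.real < 0)
```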