2022
DOI: 10.1137/21m1453542

Strongly Minimal Self-Conjugate Linearizations for Polynomial and Rational Matrices

Abstract: We prove that we can always construct strongly minimal linearizations of an arbitrary rational matrix from its Laurent expansion around the point at infinity, which happens to be the case for polynomial matrices expressed in the monomial basis. If the rational matrix has a particular self-conjugate structure we show how to construct strongly minimal linearizations that preserve it. The structures that are considered are the Hermitian and skew-Hermitian rational matrices with respect to the real line, and the p…

Cited by 7 publications (3 citation statements) · References 48 publications

“…Thus, by Theorem 1.2, L(λ) contains the information about finite poles and zeros of R(λ), and is a linearization of R(λ) in the sense of [4,Definition 3.2]. More information about different definitions of linearizations of rational matrices and how to construct linear polynomial system matrices that also preserve the pole and zero information at infinity, i.e., the pole and zero information at 0 of R(1/λ), can be found, for instance, in [4,17,18,19].…”
Section: Reversal Of Extended Block Kronecker Linearizations As Rosen... (mentioning)
confidence: 99%
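The quoted passage relies on the standard device of reading the pole and zero structure of R(λ) at infinity off the structure of R(1/λ) at 0. A small SymPy sketch of that substitution (the matrix R below is an illustrative example, not one taken from the cited papers):

import sympy as sp

lam, mu = sp.symbols('lambda mu')

# Illustrative rational matrix R(lambda) (not from the cited papers).
R = sp.Matrix([[1/lam, 1],
               [0,     lam]])

# The behaviour of R at infinity is studied through R(1/mu) at mu = 0.
R_inf = sp.simplify(R.subs(lam, 1/mu))
print(R_inf)                          # Matrix([[mu, 1], [0, 1/mu]])

# The entry 1/mu blows up at mu = 0, so R(1/mu) has a pole at 0,
# i.e. R has a pole at infinity.  A hand computation gives the
# Smith-McMillan form of R(1/mu) at 0 as diag(1/mu, mu), so this R
# also has a zero at infinity.
print(sp.limit(R_inf[1, 1], mu, 0))   # oo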
“…In that case we say that rev_1 L(λ) and diag(rev_ℓ P(λ), I_s) are equivalent at 0. In this paper the terms linearization and strong linearization of a matrix polynomial always refer to Gohberg, Lancaster, and Rodman's definitions in (2) and (3), though other non-equivalent definitions of linearizations are available in the literature [18,19]. In the last two decades, definitions (2) and (3) have been very influential in the many families of linearizations that have been developed with the goal of solving unstructured and structured PEPs (see, for instance, [3,5,7,10,11,15,25,26,27,28,30,36] among many other references on this topic).…”
Section: Introduction (mentioning)
confidence: 99%
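For concreteness, the grade-ℓ reversal appearing in the quotation is rev_ℓ P(λ) = λ^ℓ P(1/λ), i.e. the list of coefficient matrices of P written in reverse order; in the Gohberg–Lancaster–Rodman setting referred to above, a strong linearization additionally requires rev_1 L(λ) to be a linearization of rev_ℓ P(λ). A minimal NumPy sketch of the reversal (the helper name and the example pencil are illustrative, not taken from the paper):

import numpy as np

def reversal(coeffs):
    # Grade-ell reversal of P(lambda) = sum_i lambda**i * coeffs[i], with
    # ell = len(coeffs) - 1: rev_ell P(lambda) = lambda**ell * P(1/lambda),
    # which simply reverses the list of coefficient matrices.
    return list(reversed(coeffs))

# Example pencil L(lambda) = L0 + lambda*L1 (illustrative data).
L0 = np.array([[1.0, 0.0],
               [0.0, 2.0]])
L1 = np.array([[0.0, 1.0],
               [1.0, 0.0]])

rev_L = reversal([L0, L1])   # rev_1 L(lambda) = L1 + lambda*L0
print(rev_L[0])              # constant coefficient: L1
print(rev_L[1])              # coefficient of lambda: L0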
“…One of the main applications of GCRD and GCLD extraction (for F = C or R) is the reduction of polynomial system quadruples {A(λ), B(λ), C(λ), D(λ)}, introduced by Rosenbrock [22] in linear system theory and recently also of interest in the context of nonlinear eigenvalue problems; see, e.g., [6,8,19] and the references therein. Such system quadruples are a minimal representation of the (rational) transfer function R(λ) := C(λ)A(λ)^{-1}B(λ) + D(λ) provided the pairs {−A(λ), C(λ)} and {−A(λ), B(λ)} are right and left coprime, respectively.…”
Section: Introduction (mentioning)
confidence: 99%
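The transfer function written in the quotation, R(λ) = C(λ)A(λ)^{-1}B(λ) + D(λ), can be evaluated at any point λ0 where A(λ0) is nonsingular. A short NumPy sketch, with each polynomial matrix given by its list of coefficient matrices (the helper names and the toy quadruple are illustrative, not taken from the cited works):

import numpy as np

def polyval_matrix(coeffs, lam):
    # Evaluate P(lambda) = sum_i lambda**i * coeffs[i] at the scalar lam.
    return sum((lam ** i) * Ci for i, Ci in enumerate(coeffs))

def transfer_function(A, B, C, D, lam):
    # R(lambda) = C(lambda) * A(lambda)^{-1} * B(lambda) + D(lambda).
    Aval, Bval = polyval_matrix(A, lam), polyval_matrix(B, lam)
    Cval, Dval = polyval_matrix(C, lam), polyval_matrix(D, lam)
    return Cval @ np.linalg.solve(Aval, Bval) + Dval

# Toy quadruple {A, B, C, D} (illustrative data only).
A = [np.eye(2), np.diag([1.0, 2.0])]   # A(lambda) = I + lambda*diag(1, 2)
B = [np.ones((2, 1))]                  # constant B(lambda)
C = [np.ones((1, 2))]                  # constant C(lambda)
D = [np.zeros((1, 1))]                 # constant D(lambda)

print(transfer_function(A, B, C, D, 0.5))   # value of R(0.5)

When the quadruple is minimal, the eigenvalues of A(λ), i.e. the points λ0 at which A(λ0) is singular, are exactly the finite poles of R(λ), which is why the coprimeness conditions above matter.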