It is well known that the orthogonalization of the column vectors of a rectangular matrix B with respect to the bilinear form induced by a nonsingular symmetric indefinite matrix A can be seen as a factorization B = QR that is equivalent to the Cholesky-like factorization B^T A B = R^T Ω R, where R is upper triangular and Ω is a signature matrix. Under the assumption that the principal minors of the matrix M = B^T A B are nonzero, we give bounds on the conditioning of the triangular factor R in terms of the extremal singular values of M and of only those principal submatrices of M at which there is a change of sign in Ω. Using these results we study the numerical behavior of two types of orthogonalization schemes and give worst-case bounds for quantities computed in finite precision arithmetic. In particular, we analyze the implementation based on the Cholesky-like factorization of M and the Gram-Schmidt process with respect to the bilinear form induced by the matrix A. To improve the accuracy of the computed results we also consider the Gram-Schmidt process with reorthogonalization and show that its behavior is similar to that of the scheme based on the Cholesky-like factorization with one step of iterative refinement.
Introduction. For a real symmetric (in general indefinite) nonsingular matrix A ∈ R^{m,m} and a full-column-rank matrix B ∈ R^{m,n} (m ≥ n), we look for a factorization B = QR, where Q ∈ R^{m,n} is so-called (A, Ω)-orthogonal, i.e., its columns are mutually orthogonal with respect to the bilinear form induced by the matrix A, so that Q^T A Q = Ω ∈ R^{n,n} is a signature matrix Ω = diag(±1), and where R ∈ R^{n,n} is upper triangular with positive diagonal elements. Note that the full-column-rank condition on the matrix B is not enough for the existence of factors Q and R such that Q is (A, Ω)-orthogonal and R is upper triangular with positive diagonal entries. It is also easy to see that if the factorization B = QR exists, it can be regarded as an implicit Cholesky-like factorization of the symmetric indefinite matrix M = B^T A B = R^T Ω R (without its explicit computation), delivering the same upper triangular factor R. Conversely, given the Cholesky-like factorization of M, the (A, Ω)-orthogonal factor Q can be recovered as Q = BR^{-1}. Such problems appear explicitly [15] or implicitly in many applications such as eigenvalue problems, matrix pencils and structure-preserving algorithms [21,25], saddle-point problems, optimization with interior-point methods [13,36,29], and indefinite least squares problems [4,9,23,24].
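As a concrete illustration of the relation M = B^T A B = R^T Ω R and the recovery Q = BR^{-1}, the Cholesky-like factorization can be sketched as a sign-aware variant of the classical Cholesky algorithm. This is only a minimal sketch under the assumption that all leading principal minors of M are nonzero; the function name `cholesky_like` and the small matrices A and B are illustrative, not taken from the paper.

```python
import numpy as np

def cholesky_like(M):
    """Factor a symmetric indefinite M as M = R^T diag(omega) R, with R upper
    triangular with positive diagonal and omega a vector of +/-1 entries.
    Assumes all leading principal minors of M are nonzero (illustrative sketch)."""
    n = M.shape[0]
    S = np.array(M, dtype=float)        # working copy holding Schur complements
    R = np.zeros((n, n))
    omega = np.ones(n)
    for k in range(n):
        d = S[k, k]                     # nonzero by the minor assumption
        omega[k] = 1.0 if d > 0 else -1.0
        R[k, k] = np.sqrt(abs(d))       # positive diagonal entry
        R[k, k+1:] = S[k, k+1:] / (omega[k] * R[k, k])
        # update the trailing Schur complement by a signed rank-one term
        S[k+1:, k+1:] -= omega[k] * np.outer(R[k, k+1:], R[k, k+1:])
    return R, omega

# Small indefinite example: A symmetric indefinite, B of full column rank.
A = np.diag([1.0, -1.0, 2.0])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
M = B.T @ A @ B                         # symmetric indefinite 2x2 matrix
R, omega = cholesky_like(M)
Q = B @ np.linalg.inv(R)                # recover the (A, Omega)-orthogonal factor
```

With these choices one can verify numerically that R^T Ω R reproduces M, and that Q^T A Q equals diag(ω), i.e., the columns of Q are mutually orthogonal with respect to the bilinear form induced by A.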