1999
DOI: 10.1006/jmaa.1999.6320

First- and Second-Order Epi-Differentiability in Eigenvalue Optimization

Abstract: The subject of this work is the first- and second-order sensitivity analysis, by way of epi-differentiability, of some spectral functions that are essential in eigenvalue optimization. We show that the sum of the m largest eigenvalues of a real symmetric matrix is twice epi-differentiable and we derive an explicit expression for its second-order epi-derivative. We also prove that the mth largest eigenvalue function is twice epi-differentiable if and only if it ranks first in a group of equal eigenvalues. Fina…
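
For orientation, here is a brief sketch, in standard notation that is assumed rather than quoted from the paper, of the objects the abstract refers to: the ordered eigenvalues, the sum of the m largest eigenvalues, and the second-order difference quotients whose epi-limit defines the second-order epi-derivative.

\[
\lambda_1(A) \ge \lambda_2(A) \ge \cdots \ge \lambda_n(A), \qquad A \in \mathcal{S}^n \ (\text{real symmetric}),
\]
\[
\sigma_m(A) = \sum_{i=1}^{m} \lambda_i(A) \qquad \text{(sum of the $m$ largest eigenvalues)},
\]
\[
\Delta_t^2 f(x \mid v)(w) = \frac{f(x + t w) - f(x) - t \langle v, w \rangle}{\tfrac{1}{2} t^2}, \qquad v \in \partial f(x), \ t > 0.
\]

In the Rockafellar-style definition, f is twice epi-differentiable at x relative to v when these quotients epi-converge as t \downarrow 0, and the epi-limit is the second-order epi-derivative; the paper's precise conventions may differ slightly.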

Cited by 17 publications (11 citation statements, citing publications 2001–2023).
References 18 publications.
“…There are many examples where a property of the spectral function F is actually equivalent to the corresponding property of the underlying symmetric function f. Among them are first-order differentiability [9], convexity [8], generalized first-order differentiability [9,10], analyticity [26], and various second-order properties [25,24,23]. It is also worth mentioning the "Chevalley restriction theorem," which in this context identifies spectral functions that are polynomials with symmetric polynomials of the eigenvalues.…”
Section: Introduction (mentioning)
confidence: 99%
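
As a reminder of the setting this statement refers to (standard definitions, assumed here rather than taken from the cited text): a spectral function is the composition of a permutation-symmetric function with the eigenvalue map,

\[
F(A) = (f \circ \lambda)(A) = f\bigl(\lambda_1(A), \ldots, \lambda_n(A)\bigr), \qquad f(Px) = f(x) \ \text{for every permutation matrix } P,
\]

which is equivalent to orthogonal invariance, F(U^{\top} A U) = F(A) for all orthogonal U. The transfer results cited above assert that properties such as convexity and (generalized) differentiability hold for F exactly when they hold for f.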
“…This condition clearly implies condition (i) of corollary 4, and moreover excludes many important examples which are covered by our corollary 4. Of particular note is the maximum eigenvalue function studied in [14], which is shown there to be convex and twice epi-differentiable, but with a second-order epi-derivative that does not satisfy (7).…”
Section: Corollary (mentioning)
confidence: 99%
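
For context on why the maximum eigenvalue is a delicate example, the following standard first-order facts (general convex-analysis results, not specific to [14]) may help: \lambda_1 = \lambda_{\max} is convex on the symmetric matrices, and if the leading eigenvalue of A has multiplicity r, with U an n × r orthonormal basis of the corresponding eigenspace,

\[
\partial \lambda_1(A) = \bigl\{ U Z U^{\top} : Z \succeq 0, \ \operatorname{tr} Z = 1 \bigr\},
\qquad
\lambda_1'(A; H) = \lambda_{\max}\bigl( U^{\top} H U \bigr).
\]

When r > 1 the subdifferential is not a singleton, which is the source of the second-order subtleties discussed here.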
“…Torki [14] goes on to derive a formula for the second-order epi-derivative of convex-C² composite functions formed by composing the maximum eigenvalue function with C² perturbations of symmetric matrices. Part of Torki's proof of this result is essentially the verification of the inequality (ii) in our corollary 4, so his results for such convex-C² composite functions can be obtained directly from our corollary 4 as we show in the following example.…”
Section: Corollary (mentioning)
confidence: 99%
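
A hedged sketch of the composite setting mentioned in this statement, in assumed notation rather than Torki's: for a twice continuously differentiable map G : \mathbb{R}^p \to \mathcal{S}^n, set g(x) = \lambda_1(G(x)). With U an orthonormal basis of the eigenspace of G(x) associated with \lambda_1(G(x)), the first-order directional derivative takes the compressed form

\[
g'(x; d) = \lambda_{\max}\bigl( U^{\top} \, DG(x)[d] \, U \bigr),
\]

where DG(x)[d] is the derivative of G at x in the direction d. Torki's formula for the second-order epi-derivative of such convex-C² composites refines this to second order; the exact expression is not reproduced here.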
“…Friedland et al. proposed inverse eigenvalue problems [8]; Cullum et al. solved graph partitioning problems [6]; low-rank matrix optimization can also be viewed as an arbitrary-eigenvalue problem [9]; Polak and Wardi presented structural optimization problems [28,29]. Torki studied the arbitrary eigenvalue function by way of epi-differentiability [35,36]. In 1995, Hiriart-Urruty and Ye introduced the arbitrary eigenvalue problem [12], presenting a first-order sensitivity analysis of all eigenvalues of a symmetric matrix together with supporting theory.…”
mentioning
confidence: 99%