Published: 2020
DOI: 10.1007/s10092-020-00363-9
Computing several eigenvalues of nonlinear eigenvalue problems by selection

Abstract: Computing more than one eigenvalue for (large sparse) one-parameter polynomial and general nonlinear eigenproblems, as well as for multiparameter linear and nonlinear eigenproblems, is a much harder task than for standard eigenvalue problems. We present simple but efficient selection methods based on divided differences to do this. In contrast to locking techniques, it is not necessary to keep converged eigenvectors in the search space, so that the entire search space may be devoted to new information. The tec…
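
For context on the selection idea described in the abstract: selection, unlike locking, filters newly extracted Ritz pairs so that the search space is not spent reconverging to eigenvalues that have already been accepted. The sketch below illustrates only that bookkeeping with a simple relative-distance test; the paper's actual criterion is based on divided differences, and the function name and tolerance here are illustrative assumptions, not the authors' algorithm.

```python
def select_new_ritz_values(ritz_values, accepted, tol=1e-2):
    """Keep only Ritz values that are not (numerically) equal to an
    already-accepted eigenvalue, so the search space is spent on new ones.

    NOTE: this relative-distance test is an illustrative stand-in; the
    paper's selection criterion is based on divided differences.
    """
    keep = []
    for lam in ritz_values:
        if all(abs(lam - mu) > tol * max(1.0, abs(mu)) for mu in accepted):
            keep.append(lam)
    return keep

# Tiny demonstration with made-up numbers.
accepted = [1.0, 2.0 + 0.5j]
ritz = [1.0000001, 0.3, 2.0 + 0.5j, -4.2]
print(select_new_ritz_values(ritz, accepted))  # [0.3, -4.2]
```
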

Cited by 7 publications (9 citation statements)
References 28 publications
“…For the polynomial plasma drift we repeated the experiment twice by setting the stopping criteria in (2.2) to a tolerance of 1.0e-4, and the maximum number of iterations to 450, respectively. This is a very challenging problem for any eigensolver [25] with several eigenvalues of high multiplicity and/or clustered around zero. In this case the estimated backward error is not so significant since our measure of backward stability assumes that we are at convergence and we have identified an invariant subspace.…”
Section: Numerical Results
Citation type: mentioning; confidence: 99%
“…In this case the estimated backward error is not so significant since our measure of backward stability assumes that we are at convergence and we have identified an invariant subspace. In [25] the authors proposed a variation of the Jacobi-Davidson method for computing several eigenpairs of the polynomial eigenvalue problem. For the plasma drift problem they fixed a residual threshold of 1.0e-2 and within 200 iterations they were able to compute the approximations of the 19 eigenvalues closer to the origin.…”
Section: Numerical Results
Citation type: mentioning; confidence: 99%
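
The statement above reports a residual threshold of 1.0e-2 as the stopping test and refers to a backward-stability measure evaluated at convergence. A common scale-independent quantity in this setting is the normwise backward error of an approximate eigenpair (λ, x) of a matrix polynomial P(λ) = Σ_k λ^k A_k, namely ‖P(λ)x‖₂ / ((Σ_k |λ|^k ‖A_k‖₂) ‖x‖₂). The sketch below computes it with NumPy; it is shown for context only, and the exact measure used in the cited experiments may differ.

```python
import numpy as np

def backward_error(A_list, lam, x):
    """Normwise backward error of the approximate eigenpair (lam, x) of
    P(lam) = A_list[0] + lam*A_list[1] + ... + lam^d*A_list[d]."""
    x = np.asarray(x, dtype=complex)
    residual = sum(lam**k * A for k, A in enumerate(A_list)) @ x
    scale = sum(abs(lam)**k * np.linalg.norm(A, 2) for k, A in enumerate(A_list))
    return np.linalg.norm(residual) / (scale * np.linalg.norm(x))

# Convergence tests of the kind quoted above compare such a quantity against
# a threshold (e.g. 1.0e-2); the exact measure in the cited work may differ.
rng = np.random.default_rng(0)
A_list = [rng.standard_normal((5, 5)) for _ in range(3)]  # A0, A1, A2
print(backward_error(A_list, 0.3 + 0.1j, rng.standard_normal(5)))
```
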
“…We consider a BVP-eigenvalue problem, which we separate into two domains in such a way that it leads to a two-parameter eigenvalue problem. Similar techniques and analysis are found in, e.g., [9], [3, Chapter 2], and [19, Experiment 4], where it is common to force the solution to have roots within the considered interval. […] with a wavenumber κ which is discontinuous in one part of the domain and smooth in another, as in Figure 5.6.…”
Section: Domain Decomposition Example
Citation type: mentioning; confidence: 99%
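
The statement above describes splitting a boundary-value eigenvalue problem into two subdomains so that it becomes a two-parameter eigenvalue problem. For small dense instances of the linear case A_i x_i = λ B_i x_i + μ C_i x_i, i = 1, 2, a classical solution route (unrelated to the specific discretization in the cited work, and shown here only as an illustration) uses Atkinson's operator determinants built from Kronecker products:

```python
import numpy as np
from scipy.linalg import eig

def two_parameter_eigs(A1, B1, C1, A2, B2, C2):
    """Solve the linear two-parameter eigenvalue problem
        A1 x1 = lam*B1 x1 + mu*C1 x1,   A2 x2 = lam*B2 x2 + mu*C2 x2
    via Atkinson's operator determinants (dense, small problems only)."""
    D0 = np.kron(B1, C2) - np.kron(C1, B2)
    D1 = np.kron(A1, C2) - np.kron(C1, A2)
    D2 = np.kron(B1, A2) - np.kron(A1, B2)
    lam, Z = eig(D1, D0)  # D1 z = lam*D0 z; z = kron(x1, x2) for simple eigenvalues
    # Recover mu from D2 z = mu*D0 z by a least-squares quotient per eigenvector.
    mu = np.array([np.vdot(D0 @ z, D2 @ z) / np.vdot(D0 @ z, D0 @ z) for z in Z.T])
    return lam, mu

# Tiny random example; D0 is nonsingular with probability one.
rng = np.random.default_rng(1)
A1, B1, C1, A2, B2, C2 = (rng.standard_normal((3, 3)) for _ in range(6))
lam, mu = two_parameter_eigs(A1, B1, C1, A2, B2, C2)
print(lam[:3])
print(mu[:3])
```
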
“…For the plasma_drift polynomial, the backward errors on the pencil reflect the termination of the algorithm with the adopted criteria, and the invariant subspace is found with an adequate accuracy. The plasma_drift is a very challenging problem for any eigensolver [26], due to several eigenvalues of high multiplicity and/or clustered around zero. In [26] the authors proposed a variation of the Jacobi-Davidson method for computing several eigenpairs of the polynomial eigenvalue problem.…”
Section: Matrix Polynomials
Citation type: mentioning; confidence: 99%
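
"Backward errors on the pencil" in the statement above refers to measuring the error on a linearization rather than on the matrix polynomial itself: P(λ) = Σ_k λ^k A_k is replaced by an equivalent linear pencil L − λM, and a generalized eigensolver is applied to that pencil. The sketch below builds one standard companion-type linearization; the cited works may use a different linearization, and the helper name is an assumption.

```python
import numpy as np
from scipy.linalg import eigvals

def companion_pencil(A_list):
    """Companion-type linearization of P(lam) = sum_k lam^k * A_list[k].
    Returns (L, M) such that det(L - lam*M) = 0 exactly when det(P(lam)) = 0."""
    d = len(A_list) - 1              # polynomial degree
    n = A_list[0].shape[0]
    L = np.zeros((d * n, d * n), dtype=complex)
    M = np.eye(d * n, dtype=complex)
    for k in range(d):               # first block row: -[A_{d-1} ... A_1 A_0]
        L[:n, k * n:(k + 1) * n] = -A_list[d - 1 - k]
    L[n:, :(d - 1) * n] = np.eye((d - 1) * n)  # identities on the block subdiagonal
    M[:n, :n] = A_list[d]            # leading coefficient
    return L, M

# Eigenvalues of a random quadratic matrix polynomial via its linearization.
rng = np.random.default_rng(2)
A_list = [rng.standard_normal((4, 4)) for _ in range(3)]  # A0, A1, A2
L, M = companion_pencil(A_list)
print(np.sort_complex(eigvals(L, M)))
```
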
“…The plasma_drift is a very challenging problem for any eigensolver [26], due to several eigenvalues of high multiplicity and/or clustered around zero. In [26] the authors proposed a variation of the Jacobi-Davidson method for computing several eigenpairs of the polynomial eigenvalue problem. For this problem, they ran their algorithm with a residual threshold of 1.0e-2 and within 200 iterations they were able to compute the approximations of the 19 eigenvalues closer to the origin.…”
Section: Matrix Polynomials
Citation type: mentioning; confidence: 99%