2016
DOI: 10.1007/s10915-016-0245-2

Superconvergent Two-Grid Methods for Elliptic Eigenvalue Problems

Abstract: Some numerical algorithms for elliptic eigenvalue problems are proposed, analyzed, and numerically tested. The methods combine advantages of the two-grid algorithm [J. Xu and A. Zhou, Math. Comp., 70 (2001)]. To reduce the computational cost of eigenvalue problems, Xu and Zhou introduced a two-grid discretization scheme [42]. Later on, similar ideas were applied to non-self-adjoint eigenvalue problems [22] and semilinear elliptic eigenvalue problems [11]. Furthermore, it has also been generalized to thr…
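For orientation, the Xu–Zhou two-grid discretization referenced in the abstract can be sketched in standard variational notation. This is a reconstruction of the classical scheme under the usual assumptions (symmetric elliptic bilinear form $a(\cdot,\cdot)$, $L^2$ inner product $b(\cdot,\cdot)$, coarse space $V_H$ contained in the fine space $V_h$), not the specific superconvergent algorithms of this paper:
$$
\begin{aligned}
&\text{Step 1 (coarse eigenproblem): find } (\lambda_H, u_H) \in \mathbb{R} \times V_H:\quad a(u_H, v) = \lambda_H\, b(u_H, v) \quad \forall\, v \in V_H,\\
&\text{Step 2 (fine linear solve): find } u^h \in V_h:\quad a(u^h, v) = \lambda_H\, b(u_H, v) \quad \forall\, v \in V_h,\\
&\text{Step 3 (Rayleigh quotient update): } \lambda^h = \frac{a(u^h, u^h)}{b(u^h, u^h)}.
\end{aligned}
$$
For linear finite elements this classical scheme is known to give errors of roughly $O(h + H^2)$ in the $H^1$ norm and $O(h^2 + H^4)$ for the eigenvalue, so choosing $H \approx \sqrt{h}$ recovers fine-mesh accuracy from one coarse eigensolve plus a single fine-mesh source solve.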

Cited by 16 publications (6 citation statements) · References 43 publications
“…Remark 3.9. One of the most practical applications of gradient recovery techniques is to construct asymptotically exact a posteriori error estimators [1,3,19,31,41,42] for adaptive computational methods. Based on the recovery operator R_h, one can define a local a posteriori error estimator on element T ∈ T_h as…”
Section: Then the Estimate Follows By That
mentioning
confidence: 99%
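The excerpt above truncates the actual definition; for reference, a common recovery-based local estimator (a standard form assumed here, not necessarily the exact expression elided in the quote) is
$$
\eta_T = \| R_h u_h - \nabla u_h \|_{0,T}, \qquad \eta_h = \Big( \sum_{T \in \mathcal{T}_h} \eta_T^2 \Big)^{1/2},
$$
where $R_h u_h$ denotes the recovered gradient of the finite element solution $u_h$ (in some conventions the recovery operator is applied to $\nabla u_h$ instead). Asymptotic exactness means $\eta_h / \|\nabla(u - u_h)\|_{0,\Omega} \to 1$ as the mesh is refined.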
“…It can be used for mesh smoothing, a posteriori error estimate [12,20,22,23,25], and adaptive finite element method even with anisotropic meshes [2,9,11,16]. More recently, the gradient recovery technique was applied to improve eigenvalue approximation as well [8,14,15,18].…”
mentioning
confidence: 99%
“…We want to remark that lower bound of eigenvalue is very important in practice, and many efforts have been made to obtain eigenvalue approximation from below. The readers are referred to Armentano & Durán (2004), Guo et al (2016), Yang et al (2010) and Zhang et al (2007) for other ways to approximate eigenvalue from below. Our numerical experiments…”
Section: Numerical Experiments
mentioning
confidence: 99%