2017
DOI: 10.1016/j.jcp.2017.02.005

Orbital minimization method with ℓ1 regularization

Abstract: We consider a modification of the OMM energy functional which contains an ℓ1 penalty term in order to find a sparse representation of the low-lying eigenspace of self-adjoint operators. We analyze the local minima of the modified functional as well as the convergence of the modified functional to the original functional. Algorithms combining soft thresholding with gradient descent are proposed for minimizing this new functional. Numerical tests validate our approach. As an added bonus, we also prove the unant…
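The abstract's "soft thresholding with gradient descent" is the standard proximal-gradient (ISTA) pattern for ℓ1-penalized minimization. Below is a minimal sketch of that pattern on a generic ℓ1-penalized least-squares problem, not the paper's OMM functional; the helper names `soft_threshold` and `ista` are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, step, n_iter=500):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by alternating a
    gradient step on the smooth part with soft thresholding.
    Requires step <= 1 / ||A^T A||_2 for monotone descent."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                      # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step
    return x
```

With a moderate penalty `lam`, the returned `x` is typically sparse, which mirrors the abstract's goal of a sparse representation of the low-lying eigenspace.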

Cited by 10 publications (12 citation statements)
References 28 publications
“…Combining (26) with the equivalence of norms on a finite-dimensional vector space, we obtain for any $p \ge q \ge 1$
$$g(y) \ge g(x) + \nabla g(x)^{\top}(y - x) + \frac{\mu_p}{2}\,\|y - x\|_p^2 \ge g(x) + \nabla g(x)^{\top}(y - x) + n^{2/p - 2/q}\,\frac{\mu_p}{2}\,\|y - x\|_q^2, \quad \forall x, y \in S.$$…”
Section: Local Convergence of Stochastic Coordinate-wise Descent Methods
confidence: 99%
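The norm-equivalence step in the excerpt rests on the standard bound $\|x\|_p^2 \ge n^{2/p - 2/q}\,\|x\|_q^2$ for $p \ge q \ge 1$ in dimension $n$, which can be sanity-checked numerically; `norm_gap_ok` is an illustrative helper, not code from the cited work:

```python
import numpy as np

def norm_gap_ok(x, p, q):
    """Check ||x||_p^2 >= n^{2/p - 2/q} * ||x||_q^2 for p >= q >= 1,
    the norm-equivalence constant used in the excerpt's inequality."""
    n = x.size
    lhs = np.linalg.norm(x, p) ** 2
    rhs = n ** (2.0 / p - 2.0 / q) * np.linalg.norm(x, q) ** 2
    return lhs >= rhs * (1.0 - 1e-12)   # small tolerance for rounding
```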
“…Leading eigenvalue(s) problems appear in a wide range of applications, including principal component analysis (PCA), spectral clustering, dimension reduction, electronic structure calculation, quantum many-body problems, etc. As a result, many methods have been developed to address leading eigenvalue(s) problems, e.g., the power method, the Lanczos algorithm [9,17], randomized SVD [11], the (Jacobi-)Davidson algorithm [6,37], locally optimal block preconditioned conjugate gradient (LOBPCG) [16], projected preconditioned conjugate gradient (PPCG) [45], the orbital minimization method (OMM) [5,26], the Gauss-Newton algorithm [25], etc. However, most of these traditional iterative methods apply the matrix A at every iteration and require many iterations to converge, with an iteration count that usually depends on the condition number of A or the leading eigengap of A.…”
confidence: 99%
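The excerpt's point that traditional iterative methods apply the matrix A once per iteration, with convergence governed by the leading eigengap, is simplest to see in the power method. A minimal sketch for a symmetric matrix (illustrative, not from any cited package):

```python
import numpy as np

def power_method(A, tol=1e-10, max_iter=10000):
    """Power iteration for the leading eigenpair of a symmetric matrix A.
    One application of A per iteration; the convergence rate is governed
    by the ratio |lambda_2 / lambda_1| (the leading eigengap)."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = v @ A @ v
    for _ in range(max_iter):
        w = A @ v                       # the single matrix application
        v = w / np.linalg.norm(w)
        lam_new = v @ A @ v             # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, v
```

When the gap |lambda_2/lambda_1| is close to 1, the loop needs many iterations, which is exactly the cost the excerpt attributes to these methods.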
“…However, with truncation, the optimization procedure might get stuck. Strategies to alleviate the problem have been proposed, for example in Kim, Mauri and Galli (1995), Tsuchida (2007), Gao and E (2009) and Lu and Thicke (2017b). The OMM is used in the SIESTA package (Soler et al 2002) to achieve linear scaling.…”
Section: Orbital Minimization Methods
confidence: 99%
“…1993, Pfrommer, Demmel and Simon 1999). In fact, somewhat surprisingly, while the objective function (5.4) is non-convex, it is proved in Lu and Thicke (2017b) that every local minimum of (5.4) is in fact also a global minimum. Thus it suffices for an optimization algorithm to converge locally.…”
Section: Evaluation of the Kohn–Sham Map: Semi-local Functional
confidence: 99%
“…On the other hand, the objective function of qOMM, by itself, is non-convex but has no spurious local minima. 27,35,36 When the parameterization of the ansatz circuits is taken into consideration, the energy landscape of qOMM is non-convex and could have spurious local minima. For non-convex energy landscapes, usual optimizers, including L-BFGS-B, are efficient in a neighborhood of a local minimum, whereas around strict saddle points, without second-order information, they can stall for a long time.…”
Section: LiH
confidence: 99%