2021
DOI: 10.1021/acs.jctc.1c00292
Iterative Power Algorithm for Global Optimization with Quantics Tensor Trains

Abstract: Optimization algorithms play a central role in chemistry since optimization is the computational keystone of most molecular and electronic structure calculations. Herein, we introduce the iterative power algorithm (IPA) for global optimization and a formal proof of convergence for both discrete and continuous global search problems, which is essential for applications in chemistry such as molecular geometry optimization. IPA implements the power iteration method in quantics tensor train (QTT) representations. …
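The abstract describes applying the power iteration method to global optimization: repeatedly applying a diagonal operator such as e^(−V) to a density and renormalizing concentrates all amplitude at the global minimizer. The truncated abstract gives no implementation details, so the following is only a minimal dense-grid sketch of that power-iteration idea, without the QTT compression that is the paper's actual contribution; the potential V and all names are illustrative.

```python
import numpy as np

# Dense-grid sketch of the power-iteration idea (no QTT compression):
# repeatedly apply the diagonal operator exp(-V) to a density and
# renormalize; the density concentrates at the global minimum of V.
x = np.linspace(-4.0, 4.0, 2001)
V = (x**2 - 1.0) ** 2 + 0.3 * x        # tilted double well; global min near x = -1
rho = np.ones_like(x)                  # uniform initial guess
for _ in range(200):                   # power iterations
    rho *= np.exp(-V)                  # apply diagonal operator e^{-V}
    rho /= rho.max()                   # renormalize to avoid under/overflow
x_min = x[np.argmax(rho)]              # density peaks at the global minimizer
```

After enough iterations the normalized density is proportional to exp(−kV), so its argmax is simply the grid point of lowest potential, here close to x ≈ −1.04.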

Cited by 14 publications (17 citation statements) · References 73 publications
“…have been defined according to the model presented in ref 54 in place of the standard harmonic fit to the electronic ground potential energy surface at the equilibrium geometry in order to facilitate direct comparison to literature results. The wavepacket then evolves 37,72,73 The TT format of an arbitrary d-dimensional tensor…”
Section: Methods (citation type: mentioning; confidence: 99%)
“…A modified version of GRAPE, based on a Krylov approximation of the matrix exponential, allows for dealing with high-dimensional Hilbert spaces [355]. A global optimization algorithm with quantics tensor trains has been proposed [524]. Improved convergence is obtained when including second order derivative information.…”
Section: Numerical Approach (citation type: mentioning; confidence: 99%)
“…TT-SOKSL relies on the tensor-train (TT) format, 61-64 also called matrix product states (MPS) with open boundary conditions, [65][66][67][68][69] recently explored for the development of methods for quantum dynamics and global optimization. 35,70,71 The TT format of an arbitrary d-dimensional tensor X ∈ ℂ^{n_1 × ⋯ × n_d} involves a train-like matrix product of d 3-mode tensors X_i ∈ ℂ^{r_{i−1} × n_i × r_i} with r_0 = r_d = 1, so any element X(j_1, ..., j_d) of X can be evaluated as follows: 61…”
Section: Tensor-Train Decomposition (citation type: mentioning; confidence: 99%)
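The quoted passage states that a TT element X(j_1, ..., j_d) is a matrix product of core slices X_i[:, j_i, :], with boundary ranks r_0 = r_d = 1. A small sketch of that evaluation, with illustrative sizes and randomly generated cores (the shapes, not the values, are what the format prescribes):

```python
import numpy as np

# Evaluate one element of a TT-format tensor by contracting the core
# slices X_i[:, j_i, :] left to right, per the quoted formula.
rng = np.random.default_rng(0)
n, r, d = 4, 3, 5                       # mode size, TT rank, dimension (illustrative)
ranks = [1] + [r] * (d - 1) + [1]       # boundary ranks r_0 = r_d = 1
cores = [rng.standard_normal((ranks[i], n, ranks[i + 1])) for i in range(d)]

def tt_element(cores, idx):
    """X(j_1, ..., j_d) as a product of 1-slice matrices; result is 1 x 1."""
    m = cores[0][:, idx[0], :]          # shape (1, r_1)
    for core, j in zip(cores[1:], idx[1:]):
        m = m @ core[:, j, :]           # accumulate the matrix product
    return float(m[0, 0])

# Cross-check against the full (uncompressed) tensor for one element
idx = (1, 0, 3, 2, 1)
full = cores[0]
for core in cores[1:]:
    full = np.einsum('...a,aib->...ib', full, core)
full = full.reshape((n,) * d)
```

Storage drops from n^d entries for the full tensor to d·n·r² for the cores, which is the compression the quoted works exploit for quantum dynamics and global optimization.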