2022
DOI: 10.1063/5.0102423
Supervised learning and the finite-temperature string method for computing committor functions and reaction rates

Abstract: A central object in the computational studies of rare events is the committor function. Though costly to compute, the committor function encodes complete mechanistic information of the processes involving rare events, including reaction rates and transition-state ensembles. Under the framework of transition path theory, Rotskoff et al. [Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, Proceedings of Machine Learning Research (PMLR, 2022), Vol. 145, pp. 757–780] propose an algor…

Cited by 7 publications (5 citation statements)
References 58 publications
“…3,[14–16] Learning this high-dimensional function has attracted interest from a diversity of fields, and significant advances have been made through methods that employ importance sampling and machine learning. [16–35] Some notable approaches have leveraged the confinement of the transition region to compute it using string methods, 16,17,36 coarse-grained the phase space to approximate it through diffusion maps, 19,28,37,38 and parameterized neural networks by either fitting the committor directly 18,21,34 or solving the variational form of the steady-state backward Kolmogorov equation 22 by combining it with importance sampling methods. [23–25] While the learning procedures applied previously have been successful in fitting high-dimensional representations of the reaction coordinate or committors, their nonlinearity has largely made it difficult to interpret the relative importance of physically distinct descriptors and to convert those descriptors into a robust measure of the rate.…”
Section: Introduction (mentioning)
confidence: 99%
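For context on the "variational form of the steady-state backward Kolmogorov equation" mentioned in the statement above, the standard transition-path-theory formulation for overdamped Langevin dynamics (generic notation, not taken from the cited paper) is:

```latex
% Exact steady-state backward Kolmogorov equation (BKE) for the committor q,
% for overdamped Langevin dynamics with potential V at inverse temperature \beta:
\mathcal{L}q \;\equiv\; -\nabla V \cdot \nabla q \;+\; \beta^{-1} \Delta q \;=\; 0,
\qquad q\big|_{A} = 0, \quad q\big|_{B} = 1.
% Equivalent variational form: q minimizes a Boltzmann-weighted Dirichlet energy
% over functions satisfying the same boundary conditions:
q \;=\; \operatorname*{arg\,min}_{u\,:\;u|_{A}=0,\;u|_{B}=1}\;
\int \lvert \nabla u(x) \rvert^{2}\, e^{-\beta V(x)} \,\mathrm{d}x .
```

The variational objective is what neural-network parameterizations typically minimize by Monte Carlo sampling, since it requires only first derivatives of the trial function.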
“…Once we obtain configurations within each reaction channel, we train a feed-forward neural network (FNN) to solve the backward Kolmogorov equation (BKE). While a range of methods have been proposed to approximate a committor using an FNN, the method demonstrated in this work is unique in that it solves the exact BKE rather than the variational form [13–16] or the Feynman–Kac form [17,18]. This form of optimization is more accurate; however, like other methods, it is strongly sensitive to the configurations it is trained on.…”
Section: Introduction (mentioning)
confidence: 99%
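As low-dimensional context for the statement above (an illustrative sketch, not the cited work's method or system): in one dimension the exact BKE reduces to a two-point boundary-value problem that can be solved directly on a grid and checked against the known closed-form committor. The double-well potential and all parameters below are hypothetical choices.

```python
import numpy as np

# Illustrative 1D double-well system: for overdamped Langevin dynamics the
# committor q solves the exact backward Kolmogorov equation
#     -V'(x) q'(x) + (1/beta) q''(x) = 0,   q(a) = 0,  q(b) = 1.
beta = 3.0
V  = lambda x: (x**2 - 1.0)**2          # double-well potential
dV = lambda x: 4.0 * x * (x**2 - 1.0)   # its derivative

a, b, n = -1.0, 1.0, 401                # A = {x <= a}, B = {x >= b}
x = np.linspace(a, b, n)
dx = x[1] - x[0]

# Central-difference discretization of the generator on interior points.
xi    = x[1:-1]
diff  = 1.0 / (beta * dx**2)
adv   = dV(xi) / (2.0 * dx)
lower = diff + adv                      # couples q_i to q_{i-1}
main  = -2.0 * diff * np.ones_like(xi)
upper = diff - adv                      # couples q_i to q_{i+1}

A = np.diag(main) + np.diag(lower[1:], -1) + np.diag(upper[:-1], 1)
rhs = np.zeros(n - 2)
rhs[-1] = -upper[-1]                    # move the q(b) = 1 boundary term to the RHS

q = np.empty(n)
q[0], q[-1] = 0.0, 1.0
q[1:-1] = np.linalg.solve(A, rhs)

# Closed-form committor for 1D overdamped dynamics:
#     q(x) = int_a^x e^{beta V(y)} dy / int_a^b e^{beta V(y)} dy
ebv = np.exp(beta * V(x))
cum = np.concatenate([[0.0], np.cumsum(0.5 * (ebv[:-1] + ebv[1:]) * dx)])
q_exact = cum / cum[-1]

print(np.max(np.abs(q - q_exact)))      # small discretization error
print(q[n // 2])                        # ~0.5 at the barrier top (x = 0)
```

In one dimension a grid solve like this is trivial; the point of the FNN approaches discussed above is that a grid scales exponentially with dimension, while a network trained on sampled configurations does not.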
“…The high dimensionality of this problem makes its inference difficult; however, a multitude of methods developed over the past decade have made the computation of this function tractable. Some notable examples include the string method [5,6], diffusion maps (DMs) [8–11], and neural networks (NNs) [12–21].…”
Section: Introduction (mentioning)
confidence: 99%
“…3,[14–16] Learning this high-dimensional function has attracted interest from a diversity of fields, and significant advancements have been made through methods that employ importance sampling and machine learning. [16–28] Some notable approaches have leveraged the confinement of the transition region to compute it using string methods, 16,17,29 coarse-grained the phase space to approximate it through diffusion maps, 19,28,30,31 and parameterized neural networks by either fitting the committor directly 18,21 or solving the variational form of the steady-state backward Kolmogorov equation 22 by combining it with importance sampling methods. [23–25] While the learning procedures applied previously have been successful in fitting high-dimensional representations of the reaction coordinate or committors, their nonlinearity has largely made it difficult to interpret the relative importance of physically distinct descriptors and to convert those descriptors into a robust measure of the rate.…”
Section: Introduction (mentioning)
confidence: 99%