Quantum Chemistry in the Age of Machine Learning (2023)
DOI: 10.1016/b978-0-323-90049-2.00009-3

Kernel methods

Cited by 7 publications (8 citation statements)
References 51 publications (87 reference statements)
“…The user can choose from one of the implemented kernel functions, including the linear, Gaussian, exponential, Laplacian, and Matérn, as well as periodic and decaying periodic functions, which are summarized in Table . These kernel functions k(x, x_j; h) are key components required to solve the KRR problem of finding the regression coefficients α of the approximating function f̂(x; h) of the input vector x:

$$\hat{f}(\mathbf{x};\mathbf{h}) = \sum_{j=1}^{N_{\mathrm{tr}}} \alpha_j \, k(\mathbf{x},\mathbf{x}_j;\mathbf{h})$$ …”

Section: Models and Methods | Citation type: mentioning | Confidence: 99%
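For context on the quoted passage, a minimal NumPy sketch of a few of the kernel functions it names (Gaussian, Laplacian, Matérn) is given below. The function names, the single length-scale hyperparameter sigma, and the restriction of the Matérn kernel to the ν = 3/2 case are illustrative assumptions and do not reproduce MLatom's actual interface or defaults.

```python
import numpy as np

def gaussian_kernel(x, xj, sigma=1.0):
    """Gaussian kernel: exp(-||x - x_j||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - xj) ** 2) / (2.0 * sigma ** 2))

def laplacian_kernel(x, xj, sigma=1.0):
    """Laplacian kernel: exp(-||x - x_j||_1 / sigma)."""
    return np.exp(-np.sum(np.abs(x - xj)) / sigma)

def matern32_kernel(x, xj, sigma=1.0):
    """Matern kernel, nu = 3/2 case: (1 + sqrt(3) d / sigma) * exp(-sqrt(3) d / sigma)."""
    d = np.linalg.norm(x - xj)
    return (1.0 + np.sqrt(3.0) * d / sigma) * np.exp(-np.sqrt(3.0) * d / sigma)
```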
“…MLatom also provides the flexibility of training custom models based on kernel ridge regression (KRR) for a given set of input vectors x or XYZ coordinates and any labels y. If XYZ coordinates are provided, they can be transformed into one of the several supported descriptors (e.g., inverse internuclear distances and their version normalized relative to the equilibrium structure (RE), and the Coulomb matrix). The user can choose from one of the implemented kernel functions, including the linear, Gaussian, exponential, Laplacian, and Matérn, as well as periodic and decaying periodic functions, which are summarized in Table . These kernel functions k(x, x_j; h) are key components required to solve the KRR problem of finding the regression coefficients α of the approximating function f̂(x; h) of the input vector x:

$$\hat{f}(\mathbf{x};\mathbf{h}) = \sum_{j=1}^{N_{\mathrm{tr}}} \alpha_j \, k(\mathbf{x},\mathbf{x}_j;\mathbf{h})$$ …”

Section: Models and Methods | Citation type: mentioning | Confidence: 99%
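As a companion to the descriptor list in the quoted passage, here is a minimal sketch of two of the descriptors it mentions, inverse internuclear distances and the standard Coulomb matrix. The RE descriptor (normalization by equilibrium-structure distances) is not shown, and the function names and array conventions are assumptions for illustration, not MLatom's API.

```python
import numpy as np

def inverse_distances(xyz):
    """Upper-triangle vector of inverse internuclear distances 1/r_ij
    from an (n_atoms, 3) array of Cartesian coordinates."""
    i, j = np.triu_indices(len(xyz), k=1)
    return 1.0 / np.linalg.norm(xyz[i] - xyz[j], axis=1)

def coulomb_matrix(xyz, z):
    """Standard Coulomb matrix: M_ii = 0.5 * Z_i^2.4, M_ij = Z_i * Z_j / r_ij."""
    n = len(xyz)
    m = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a == b:
                m[a, b] = 0.5 * z[a] ** 2.4
            else:
                m[a, b] = z[a] * z[b] / np.linalg.norm(xyz[a] - xyz[b])
    return m
```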
“…Kernel ridge regression (KRR) belongs to the kernel methods and is directly related to Gaussian process regression (GPR). The fitting functions of KRR and GPR are identical, but GPR is derived from a Bayesian perspective, which provides a variance (useful for estimating uncertainty), and hyperparameter optimization is often done differently in GPR (where, often, the log marginal likelihood is optimized). For each input vector x_i, the prediction of KRR is given as

$$f(\mathbf{x}_i) = y_{\mathrm{prior}} + \sum_{j=1}^{N_{\mathrm{tr}}} \alpha_j \, k(\mathbf{x}_i,\mathbf{x}_j)$$

where N_tr is the number of training points, α_j is a regression coefficient, k(x_i, x_j) is the kernel function discussed below, and y_prior is, following the related Gaussian process regression terminology, a prior function. In the MLatom implementation, y_prior is simply a constant that is set to zero by default, or it can be set to the mean of the reference values or any other user-defined value.…”

Section: Methods | Citation type: mentioning | Confidence: 99%
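The prediction formula in this excerpt can be mirrored by a short sketch: training solves a regularized kernel system for the coefficients α, and prediction evaluates the weighted kernel sum plus y_prior. The function names, the regularization parameter lam, and its default value are assumptions for illustration; they are not MLatom's interface or its hyperparameter-optimization scheme.

```python
import numpy as np

def krr_train(X, y, kernel, lam=1e-8, y_prior=0.0):
    """Solve (K + lam * I) alpha = y - y_prior for the regression coefficients."""
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    return np.linalg.solve(K + lam * np.eye(len(X)), np.asarray(y) - y_prior)

def krr_predict(x, X_train, alpha, kernel, y_prior=0.0):
    """f(x) = y_prior + sum_j alpha_j * k(x, x_j), as in the excerpt above."""
    return y_prior + sum(a * kernel(x, xj) for a, xj in zip(alpha, X_train))
```

Under these assumptions, krr_train builds the full kernel matrix over the training set and krr_predict evaluates the sum over training points exactly as written in the quoted prediction formula.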
“…The GEK approacha GPR variantis an ML method that has been applied with significant success to, for example, molecular structure optimizations. , In this report, such a GEK implementation especially modified for the case of SCF orbital optimizations is presented. The size of the parameter space, N SCF , however, is a significant problem for an efficient implementation, as reported by Ritterhoff, who demonstrated that the GEK procedure applied to SCF orbital optimization can have competitive iteration counts, while the timings are very unfavorable due to the N 3 scaling of the procedure.…”
Section: Theorymentioning
confidence: 99%
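To illustrate where the cubic scaling mentioned in this statement comes from, here is a generic GPR/GEK-style solve of the kernel system: the Cholesky factorization of the N x N kernel matrix costs O(N^3). This is only a sketch of the generic bottleneck, not the modified GEK implementation presented in the citing report, and the function name and noise parameter are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def gpr_weights(K, y, noise=1e-10):
    """Generic GPR-style fit: Cholesky factorization of the (N x N) kernel
    matrix dominates the cost and scales as O(N^3)."""
    c = cho_factor(K + noise * np.eye(len(K)), lower=True)
    return cho_solve(c, y)
```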