2022
DOI: 10.1021/acs.jctc.1c00813
Local Kernel Regression and Neural Network Approaches to the Conformational Landscapes of Oligopeptides

Abstract: The application of machine learning to theoretical chemistry has made it possible to combine the accuracy of quantum chemical energetics with the thorough sampling of finite-temperature fluctuations. To reach this goal, a diverse set of methods has been proposed, ranging from simple linear models to kernel regression and highly nonlinear neural networks. Here we apply two widely different approaches to the same, challenging problem: the sampling of the conformational landscape of polypeptides at finite temperat…

Cited by 11 publications (23 citation statements). References 114 publications.
“…A more detailed analysis across different functional groups present in organocatalysts shows consistent improvement by the LKR–OMP correction (Figure 3C and Table S4 in Supporting Information), such that predictions can be made on highly complex and diverse data sets and the desired FESs can be reliably constructed. [79] …”
Section: Results
Confidence: 99%
“…Our primary objective is to obtain the FES of any large and flexible organic molecule at a hybrid DFT level of theory while simultaneously circumventing the computational cost imposed by ab initio MD. To address this, we choose to use the in-house developed LKR–OMP model, which allows a robust and accurate machine-learning potential (MLP) to be developed with reduced training time compared to conventional neural network potentials (i.e., Behler–Parrinello-type neural networks). Subsequent analysis focuses on accomplishing this objective.…”
Section: Results
Confidence: 99%
“…While open questions undoubtedly exist surrounding interpolation vs. extrapolation, as well as designing challenging out-of-sample test sets such as those containing new chemistry [26, 28, 42–49], larger [9, 50, 51] and/or outlier [52] molecules/materials, or new sets appearing over time [53–55], this topic remains an ongoing community-wide discussion with no clear best practices, and will not be discussed further here.…”
Section: Data Splits
Confidence: 99%
“…Their advantages become more significant when molecular and dataset sizes are larger [172–179]. Also, earlier studies demonstrated that stacked generalization has a stronger predictive power than any single model [89–95]. In our model, each base learner generates a non-linear quantitative relationship between the CMD and the optimal ω^ML.…”
Confidence: 99%