In this work we explore the properties that make many real-life global optimization problems extremely difficult to handle, as well as some of the common techniques used in the literature to address them. We then introduce GloMPO (Globally Managed Parallel Optimization), a general optimization management tool that helps address some of the challenges faced by practitioners. GloMPO manages, and shares information between, traditional optimization algorithms run in parallel. We intend GloMPO to be a flexible framework that allows for the customization and hybridization of various optimization ideas, while also providing a substitute for the human interventions and decisions that are a common feature of optimization processes for hard problems. GloMPO is shown to produce lower minima than traditional optimization approaches on global optimization test functions, the Lennard-Jones cluster problem, and ReaxFF reparameterizations. Its novel feature of forced optimizer termination is shown to find better minima than normal optimization. GloMPO also provides qualitative benefits, such as identifying degenerate minima and offering a standardized interface and workflow manager.
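The management strategy described above, several optimizers sharing a global incumbent while a supervisor force-terminates and restarts stragglers, can be illustrated with a toy loop. The objective, the local random-search "optimizers", and the termination rule below are illustrative assumptions for this sketch, not GloMPO's actual algorithms or API.

```python
import numpy as np

# Toy multimodal objective (2-D Rastrigin): many local minima,
# global minimum of 0 at the origin.
def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
DIM, N_OPT, GRACE, GAP = 2, 4, 200, 5.0

def fresh():
    """Start a new local random-search 'optimizer' at a random point."""
    x = rng.uniform(-5, 5, size=DIM)
    return {"x": x, "f": rastrigin(x), "age": 0}

opts = [fresh() for _ in range(N_OPT)]
global_best = min(o["f"] for o in opts)

for _ in range(3000):
    for i, o in enumerate(opts):
        cand = o["x"] + rng.normal(0, 0.1, size=DIM)  # local trial move
        f_cand = rastrigin(cand)
        if f_cand < o["f"]:
            o["x"], o["f"] = cand, f_cand
        o["age"] += 1
        global_best = min(global_best, o["f"])        # shared incumbent
        # Manager decision: after a grace period, force-terminate any
        # optimizer stuck far above the incumbent and restart it elsewhere.
        if o["age"] > GRACE and o["f"] > global_best + GAP:
            opts[i] = fresh()
```

Because the manager sees all optimizers' trajectories, budget is continually shifted away from runs converging to poor local minima, which is the intuition behind forced termination.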
We apply a global sensitivity method, the Hilbert–Schmidt independence criterion (HSIC), to the reparametrization of a Zn/S/H ReaxFF force field in order to identify the most appropriate parameters for reparametrization. Parameter selection remains a challenge in this context: high-dimensional optimizations are prone to overfitting and take a long time, but selecting too few parameters leads to poor-quality force fields. We show that the HSIC correctly and quickly identifies the most sensitive parameters, and that optimizations done using a small number of sensitive parameters outperform those done using a higher-dimensional, reasonable-user parameter selection. Optimizations using only sensitive parameters (1) converge faster, (2) have loss values comparable to those found with the naive selection, (3) have similar accuracy in validation tests, and (4) do not suffer from problems of overfitting. We demonstrate that an HSIC global sensitivity analysis is a cheap optimization preprocessing step with both qualitative and quantitative benefits that can substantially simplify and speed up ReaxFF reparametrizations.
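The HSIC-based parameter ranking described above can be sketched in a few lines: sample the parameters, evaluate a loss, and score each parameter's statistical dependence on the loss with an empirical HSIC estimate. The toy loss function, sample sizes, and median bandwidth heuristic below are illustrative assumptions, not the paper's actual force-field setup.

```python
import numpy as np

def gaussian_kernel(v, sigma=None):
    """Pairwise Gaussian kernel matrix for a 1-D sample vector."""
    d2 = (v[:, None] - v[None, :]) ** 2
    if sigma is None:  # median heuristic for the bandwidth
        sigma = np.sqrt(np.median(d2[d2 > 0]) / 2)
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y):
    """Biased empirical HSIC estimate between two 1-D samples."""
    n = len(x)
    K, L = gaussian_kernel(x), gaussian_kernel(y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Rank parameters by sensitivity: sample the parameter space, evaluate
# the loss, then score each parameter column against the loss values.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))  # 3 hypothetical parameters
loss = X[:, 0] ** 2 + 0.01 * X[:, 1]   # parameter 0 dominates the loss
scores = [hsic(X[:, j], loss) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]     # most sensitive parameter first
```

Only the top-ranked parameters would then be released in the subsequent low-dimensional optimization, which is what makes this a cheap preprocessing step: the loss samples can be reused and no gradients are needed.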
Based on experimental data of both batch and continuous enzyme-catalyzed kinetic resolutions of (±)-trans-1,2-cyclohexanediol in supercritical carbon dioxide, kinetic models of increasing complexity were developed to explore the strengths and drawbacks of various modeling approaches. The simplest, first-order model proved to be a good fit for the batch experimental data in regions of high reagent concentrations but failed elsewhere. A more complex model that closely follows the true mechanism was able to fit the full range of experimental data with constant reaction rate coefficients, and was successfully used to predict the results of the same reaction run continuously in a packed bed reactor. Care must be taken when working with such models, however, to avoid problems of overfitting; a more complex model is not always more accurate. This work may serve as an example for more rigorous reaction modeling and reactor design in the future.
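The simplest approach mentioned above, a first-order batch model, amounts to fitting dC/dt = -kC, i.e. C(t) = C0·exp(-kt), to concentration-time data. The sketch below uses synthetic data and a hypothetical rate constant purely for illustration; it is not the study's data or its full mechanistic model hierarchy.

```python
import numpy as np
from scipy.optimize import curve_fit

# First-order batch model: dC/dt = -k*C  =>  C(t) = C0 * exp(-k*t)
def first_order(t, k, c0):
    return c0 * np.exp(-k * t)

# Synthetic "batch" concentration data (illustrative, not the paper's data)
t = np.linspace(0, 10, 25)          # time, arbitrary units
true_k, true_c0 = 0.35, 1.0         # hypothetical rate constant and C0
rng = np.random.default_rng(1)
c_obs = first_order(t, true_k, true_c0) + rng.normal(0, 0.01, t.size)

# Least-squares fit of the rate coefficient and initial concentration
(k_fit, c0_fit), _ = curve_fit(first_order, t, c_obs, p0=(0.1, 0.5))
```

A mechanistic model would replace the single exponential with a system of rate equations integrated numerically, at the cost of more parameters and a greater risk of overfitting, which is the trade-off the abstract highlights.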