Gradient descent methods have previously proved useful for optimizing k-eigenvalue nuclear systems, but the use of k-eigenvalue gradients has been computationally challenging because the gradient estimates are stochastic. ADAM is a gradient descent method designed to handle stochastic gradients. This analysis uses constructed challenge problems to verify whether ADAM is a suitable tool for optimizing k-eigenvalue nuclear systems. ADAM successfully optimizes nuclear systems using the gradients of k-eigenvalue problems despite their stochastic nature and uncertainty. Furthermore, the results clearly demonstrate that low-compute-time, high-variance estimates of the gradient lead to better performance in the optimization challenge problems tested here.
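For reference, the ADAM update keeps exponentially decayed estimates of the gradient mean and of the squared gradient, applies a bias correction, and scales each step by the ratio of the two. The sketch below is a minimal illustration of that update applied to a noisy gradient estimate; the parameter values and the ascent sign (chosen here as if k-effective were being maximized) are illustrative assumptions, not the implementation used in the study.

import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8):
    """One ADAM update. `grad` may be a noisy (e.g., Monte Carlo) estimate
    of the k-eigenvalue gradient with respect to the design parameters."""
    m = beta1 * m + (1.0 - beta1) * grad            # running mean of the gradient
    v = beta2 * v + (1.0 - beta2) * grad**2         # running mean of the squared gradient
    m_hat = m / (1.0 - beta1**t)                    # bias corrections for early steps
    v_hat = v / (1.0 - beta2**t)
    # Ascent step to increase k-effective; flip the sign for a minimization problem.
    theta = theta + lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

Because the second-moment term normalizes each step, a high-variance but cheap gradient estimate can still drive useful updates, which is consistent with the behavior reported above.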
Genetic algorithms (GA) are used to optimize the Fast Neutron Source (FNS) core fuel loading to maximize a multiobjective function. The FNS has 150 material locations, each of which can be loaded with one of three different materials, resulting in over 3E+71 combinations. Individual designs are evaluated with computationally intensive calls to MCNP. To speed up the optimization, convolutional neural networks (CNN) are trained as surrogate models and used to produce better-performing candidates that meet the design constraints before they are sent to the costly MCNP evaluations. A major hurdle in training neural networks of any kind is the availability of robust training data. In this application, we use the data produced by the GA as training data for the surrogate models, which combine geometric features of the system to predict the objectives and constraints. Using these surrogate models, the accelerated algorithm produced more viable designs and significantly improved the objective function while using the same computational resources.
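The screening step described above can be pictured roughly as follows. This is a hedged sketch under stated assumptions, not the authors' code: `surrogate`, `evaluate_mcnp`, and `n_keep` are hypothetical placeholders for the trained CNN, the expensive MCNP evaluation, and the shortlist size.

import numpy as np

def screen_generation(candidates, surrogate, evaluate_mcnp, n_keep):
    """Rank GA candidates with the cheap CNN surrogate and send only the
    most promising ones to the expensive MCNP evaluation."""
    predicted = np.asarray([surrogate(c) for c in candidates])   # fast surrogate scores
    order = np.argsort(predicted)[::-1]                          # best predicted first
    shortlist = [candidates[i] for i in order[:n_keep]]
    evaluated = [(c, evaluate_mcnp(c)) for c in shortlist]       # high-fidelity objective
    return evaluated

Each generation, the high-fidelity results returned by a loop like this one also become additional training data for the surrogate, which is how the GA supplies the robust training set noted above.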