Calabi-Yau fourfolds may be constructed as hypersurfaces in weighted projective spaces of complex dimension five, defined via weight systems of six weights. In this work, neural networks were implemented to learn the Calabi-Yau Hodge numbers from the weight systems; gradient saliency and symbolic regression then inspired a truncation of the Landau-Ginzburg model formula for the Hodge numbers of a Calabi-Yau of any dimension constructed in this way. The approximation always provides a tight lower bound, is dramatically quicker to compute (reducing computation times by up to four orders of magnitude), and gives remarkably accurate results for systems with large weights. Additionally, complementary datasets of weight systems satisfying the necessary but not sufficient conditions for transversality were constructed, including considerations of the interior point, reflexivity, and intradivisibility properties, producing an overall classification of this weight-system landscape that was further confirmed with machine learning methods. Using this classification and the properties of the presented approximation, a novel dataset of transverse weight systems of seven weights was generated for sums of weights ≤ 200, producing a new database of Calabi-Yau fivefolds with their respective topological properties computed. Furthermore, an equivalent database of candidate Calabi-Yau sixfolds was generated with approximated Hodge numbers. Published by the American Physical Society 2024
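As an illustration of the kind of supervised setup the abstract describes, the following is a minimal sketch of a feed-forward network regressing a Hodge number from a six-component weight system. It is not the authors' pipeline: the architecture, the scikit-learn choice, and the placeholder inputs and targets are assumptions for illustration only; real training data would come from the weighted-projective-space weight-system database.

```python
# Minimal sketch (assumed setup, not the authors' code): learn a map from
# six-weight systems to a Hodge-number-like target with a small MLP.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: rows are weight systems (w1, ..., w6); the target is a
# dummy function standing in for a Hodge number. These are NOT real CY data.
X = rng.integers(1, 200, size=(1000, 6)).astype(float)
y = X.sum(axis=1) / X.min(axis=1)

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Small fully connected regressor; hidden sizes are an illustrative choice.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print("validation R^2:", model.score(X_val, y_val))
```

A differentiable framework (e.g. PyTorch or JAX) would be needed to reproduce the gradient-saliency analysis mentioned above; this sketch only shows the regression step.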
Gaussian process regression, kernel support vector regression, random forest, extreme gradient boosting, and generalized linear model algorithms are applied to data of complete intersection Calabi-Yau threefolds. It is shown that Gaussian process regression is the most suitable for learning the Hodge number h^{2,1} in terms of h^{1,1}. The performance of this regression algorithm is such that the Pearson correlation coefficient for the validation set is R^2 = 0.9999999995 with a root mean square error RMSE = 0.0002895011; for the training set, R^2 = 0.9999999994 and RMSE = 0.0002854348. The training error and the cross-validation error of this regression are 1×10^{-9} and 1.28×10^{-7}, respectively. Learning the Hodge number h^{1,1} in terms of h^{2,1} yields R^2 = 1.000000 and RMSE = 7.395731×10^{-5} for the validation set of the Gaussian process regression. Published by the American Physical Society 2024
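The sketch below shows the general shape of such a Gaussian process regression of one Hodge number against the other, assuming a scikit-learn implementation. The (h^{1,1}, h^{2,1}) pairs are synthetic placeholders rather than the complete intersection Calabi-Yau threefold data, and the RBF-plus-white-noise kernel is an assumed choice, so the printed metrics will not match the figures quoted above.

```python
# Minimal sketch (assumed, not the paper's code): GP regression of a stand-in
# h^{2,1} against a stand-in h^{1,1}, with validation R^2 and RMSE reported.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Placeholder Hodge-number pairs, NOT the CICY threefold list.
h11 = rng.integers(1, 20, size=(300, 1)).astype(float)
h21 = 50.0 - 2.0 * h11[:, 0] + rng.normal(0.0, 0.5, size=300)

X_train, X_val, y_train, y_val = train_test_split(
    h11, h21, test_size=0.2, random_state=1
)

# RBF kernel plus a white-noise term; hyperparameters are fit by the optimizer.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)

pred = gpr.predict(X_val)
print("validation R^2 :", r2_score(y_val, pred))
print("validation RMSE:", mean_squared_error(y_val, pred) ** 0.5)
```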