2022
DOI: 10.1038/s41598-022-13007-5

Optimization of physical quantities in the autoencoder latent space

Abstract: We propose a strategy for optimizing physical quantities based on exploring the latent space of a variational autoencoder (VAE). We train a VAE model using various spin configurations formed on a two-dimensional chiral magnetic system. Three optimization algorithms are used to explore the latent space of the trained VAE. The first algorithm, the single-code modification algorithm, is designed to improve the local energetic stability of spin configurations and generate physically plausible spin states. The…
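
To make the abstract's strategy concrete, here is a minimal sketch of optimization by latent-space exploration, not the paper's actual method: a latent code is repeatedly perturbed, and perturbations that lower the energy are kept. The `decode` and `energy` functions below are hypothetical stand-ins for the trained VAE decoder and the physical quantity of interest.

```python
import numpy as np

# Hypothetical stand-ins: in the paper, `decode` would be the trained VAE
# decoder and `energy` the physical quantity evaluated on a spin configuration.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64))  # toy linear "decoder" weights


def decode(z):
    """Map a 16-dim latent code to a toy 64-dim configuration."""
    return np.tanh(z @ W)


def energy(x):
    """Toy quadratic energy; lower is better."""
    return np.sum(x ** 2)


def latent_search(z0, steps=500, sigma=0.1):
    """Greedy random search in latent space: keep moves that lower the energy."""
    z, best = z0, energy(decode(z0))
    for _ in range(steps):
        cand = z + sigma * rng.normal(size=z.shape)
        e = energy(decode(cand))
        if e < best:
            z, best = cand, e
    return z, best


z_opt, e_opt = latent_search(rng.normal(size=16))
print(f"optimized energy: {e_opt:.4f}")
```

Searching in the low-dimensional latent space rather than over raw spin configurations is the point of the strategy: the decoder constrains candidates to the manifold of plausible states.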

Cited by 9 publications (2 citation statements)
References 31 publications
“…Wu et al proposed a new genetic algorithm that divides the initial large-scale problem into several small-scale subproblems and then uses a hybrid search strategy to solve each subproblem [12]. Minh et al proposed a strategy for optimizing physical quantities based on exploring the latent space of a variational autoencoder (VAE) [13]. This algorithm trains a VAE on an existing dataset whose elements perform well on the fitness functions.…”
Section: A. Genetic Algorithms and Acceleration Strategies (mentioning)
confidence: 99%
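
As a rough sketch of the divide-and-conquer idea attributed to Wu et al [12], the code below splits a separable 40-dimensional problem into four independent 10-dimensional blocks and solves each with a tiny mutation-only GA; the hybrid search strategy of [12] is not reproduced here, and the objective and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)


def fitness(x):
    """Toy separable objective (sphere function); lower is better."""
    return np.sum(x ** 2)


def solve_subproblem(dim, pop=20, gens=50):
    """Tiny GA on one low-dimensional block (selection + mutation only)."""
    pop_x = rng.normal(size=(pop, dim))
    for _ in range(gens):
        scores = np.array([fitness(x) for x in pop_x])
        elite = pop_x[np.argsort(scores)[: pop // 2]]          # keep best half
        children = elite + 0.1 * rng.normal(size=elite.shape)  # mutate elites
        pop_x = np.vstack([elite, children])
    return min(pop_x, key=fitness)


# Divide a 40-dim problem into 4 independent 10-dim subproblems,
# solve each separately, then concatenate the block solutions.
x_full = np.concatenate([solve_subproblem(10) for _ in range(4)])
print(f"assembled solution fitness: {fitness(x_full):.4f}")
```

The decomposition only pays off when interactions between blocks are weak; for strongly coupled variables the subproblems are no longer independent.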
“…Generally, a hyperparameter is a parameter that controls the learning process of an ML model, and hyperparameter tuning (or optimization) is the problem of choosing a set of hyperparameter values that makes the learning process optimal. Extensive effort has been dedicated to the optimal tuning of ML models using different optimization techniques, such as gradient-based methods, genetic algorithms (GA), and Bayesian optimization (BO) [13][14][15][16][17][18][19]. When tuning ML models whose training cost is computationally high, any exhaustive or manual search of the parameter space is highly undesirable.…”
Section: Introduction (mentioning)
confidence: 99%
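
To illustrate why the statement dismisses exhaustive or manual search, here is a minimal random-search sketch over a log-uniform hyperparameter space; `validation_loss` is a hypothetical stand-in for training a model and measuring its validation loss, and the ranges are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)


def validation_loss(lr, reg):
    """Hypothetical stand-in for 'train a model, return validation loss':
    a smooth surrogate with a known optimum at lr=1e-2, reg=1e-4."""
    return (np.log10(lr) + 2) ** 2 + (np.log10(reg) + 4) ** 2


# Random search: sample hyperparameters log-uniformly instead of gridding
# the space exhaustively, and keep the best configuration seen so far.
best_cfg, best_loss = None, np.inf
for _ in range(100):
    lr = 10 ** rng.uniform(-5, 0)    # learning rate in [1e-5, 1]
    reg = 10 ** rng.uniform(-6, -1)  # regularization in [1e-6, 1e-1]
    loss = validation_loss(lr, reg)
    if loss < best_loss:
        best_cfg, best_loss = (lr, reg), loss

print(f"best lr={best_cfg[0]:.2e}, reg={best_cfg[1]:.2e}, loss={best_loss:.4f}")
```

With 100 samples this costs 100 training runs regardless of dimensionality, whereas a 10-point grid over just two hyperparameters already needs 100 runs and grows exponentially with each additional hyperparameter.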