2021
DOI: 10.1016/j.asoc.2021.107909
MO-PaDGAN: Reparameterizing Engineering Designs for augmented multi-objective optimization


Cited by 28 publications (17 citation statements)
References 24 publications
“…Many different approaches have attempted to improve the performance of low-fidelity surrogate models. For example, self-supervised data augmentation [79] uses the DGM itself to generate samples in sparse regions (through optimization [166] or conditioning [79]), which can then be used to train the low-fidelity models to perform more accurately and, in turn, improve methods that rely on them for design generation. Multi-fidelity modeling is another promising approach, which involves building surrogate models that augment a few costly high-fidelity samples with low-fidelity samples to attain higher-fidelity surrogates at minimal expense.…”
Section: Design Performance Evaluation (mentioning)
confidence: 99%
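The excerpt above describes two complementary ideas: using the DGM itself to augment sparse training data, and multi-fidelity modeling that corrects a cheap surrogate with a few expensive samples. Below is a minimal sketch of the second idea as a generic additive-correction surrogate; it is an illustration under toy assumptions (the `f_low`/`f_high` functions, dimensions, and kernel choices are all hypothetical), not the specific method of reference [79] or [166].

```python
# Hedged sketch: additive-correction multi-fidelity surrogate.
# Generic illustration of the idea quoted above, not the cited method;
# every name and dimension here is a toy assumption.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def f_high(x):              # costly high-fidelity response (toy stand-in)
    return np.sin(8.0 * x) + 0.2 * x

def f_low(x):               # cheap, biased low-fidelity response (toy stand-in)
    return np.sin(8.0 * x) + 0.6

X_low = rng.uniform(0.0, 1.0, size=(50, 1))    # many cheap samples
X_high = rng.uniform(0.0, 1.0, size=(6, 1))    # few expensive samples

# 1) Surrogate fitted on the abundant low-fidelity data.
gp_low = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6, normalize_y=True)
gp_low.fit(X_low, f_low(X_low).ravel())

# 2) Correction model fitted on the residual at the high-fidelity points.
residual = f_high(X_high).ravel() - gp_low.predict(X_high)
gp_corr = GaussianProcessRegressor(kernel=RBF(0.3), alpha=1e-6, normalize_y=True)
gp_corr.fit(X_high, residual)

def predict_multifidelity(x):
    """Prediction = low-fidelity surrogate + learned high/low discrepancy."""
    return gp_low.predict(x) + gp_corr.predict(x)

X_test = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
print(predict_multifidelity(X_test))   # augmented prediction
print(f_high(X_test).ravel())          # ground truth for comparison
```

The appeal, as the excerpt notes, is that the correction model needs only a handful of expensive evaluations while the bulk of the training signal comes from cheap low-fidelity samples.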
“…Several recent works [166,19,160] have proposed methods to encourage creativity and novelty in DGMs. In their CreativeGAN framework, Nobari et al. focus on identifying novelty and guiding DGMs towards such behaviour by directly introducing novel features into typical designs, thereby expanding the design space and the novelty of the DGM's data.…”
Section: Creativity and Novelty (mentioning)
confidence: 99%
“…They therefore take the approach of maximizing entropy in the generated samples. Others have shown that simply promoting diversity in GANs allows them to fill in gaps in the data distribution and even deviate slightly from the original distribution [10,9]. In this work we take a different approach and promote self-creativity in GANs.…”
Section: The Creativity Problem in GANs (mentioning)
confidence: 99%
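To make the "promoting diversity" idea in this excerpt concrete, one simple option is to add a batch-level spread term to the generator objective. The sketch below uses a mean pairwise-distance penalty as an illustrative stand-in; it is neither the entropy objective of the quoted work nor the DPP-based loss of MO-PaDGAN itself, and the tensor shapes are assumptions.

```python
# Hedged sketch: a generic diversity term added to a GAN generator loss.
# Illustrative stand-in only; not the objective of any work cited above.
import torch
import torch.nn.functional as F

def diversity_penalty(fake_batch: torch.Tensor) -> torch.Tensor:
    """Negative mean pairwise L2 distance: minimizing it pushes samples apart."""
    flat = fake_batch.flatten(start_dim=1)       # (batch, features)
    dists = torch.cdist(flat, flat, p=2)         # (batch, batch) pairwise distances
    n = flat.size(0)
    mean_off_diag = dists.sum() / (n * (n - 1))  # diagonal is zero, so this averages i != j
    return -mean_off_diag

def generator_loss(d_logits_fake: torch.Tensor,
                   fake_batch: torch.Tensor,
                   div_weight: float = 0.1) -> torch.Tensor:
    """Non-saturating GAN generator loss plus the weighted diversity term."""
    adv = F.binary_cross_entropy_with_logits(
        d_logits_fake, torch.ones_like(d_logits_fake))
    return adv + div_weight * diversity_penalty(fake_batch)

# Toy usage with random tensors standing in for generator and discriminator outputs.
fake = torch.randn(16, 2, 32)    # e.g. 16 generated shapes as coordinate arrays
d_out = torch.randn(16, 1)       # discriminator logits for those fakes
print(generator_loss(d_out, fake))
```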
“…To overcome this problem, data-driven methods such as generative adversarial networks (GANs) [7] and variational autoencoders (VAEs) [8] have been employed in many design synthesis problems [9,10,11,12,13,14,15]. GANs and VAEs are generally capable of learning complex distributions of existing designs and can even take performance and quality evaluation into account when generating new designs [9,10]. They allow for learning an underlying low-dimensional latent space that can represent the existing designs.…”
Section: Introduction (mentioning)
confidence: 99%
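The "underlying low-dimensional latent space" in the excerpt above can be illustrated with a minimal VAE: an encoder maps a design vector to a latent mean and variance, and a sampled latent code is decoded back into a design. The sketch below is a generic illustration; the flat 64-dimensional design representation and the layer sizes are assumptions, not the architecture of any cited work.

```python
# Hedged sketch: a minimal VAE over flattened "design" vectors.
# Generic illustration of learning a low-dimensional latent space;
# dimensions and architecture are assumptions, not a cited model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DesignVAE(nn.Module):
    def __init__(self, design_dim: int = 64, latent_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(design_dim, 128), nn.ReLU())
        self.to_mu = nn.Linear(128, latent_dim)        # latent mean
        self.to_logvar = nn.Linear(128, latent_dim)    # latent log-variance
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, design_dim))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    """Reconstruction error plus KL divergence to a standard normal prior."""
    rec = F.mse_loss(recon, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

# Toy usage: one gradient step on a random batch of "designs".
model = DesignVAE()
batch = torch.randn(32, 64)
recon, mu, logvar = model(batch)
loss = vae_loss(recon, batch, mu, logvar)
loss.backward()
print(float(loss))
```

New designs can then be synthesized by sampling a latent code from the prior and decoding it, which is the property the quoted introduction highlights.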