2016
DOI: 10.1016/j.advengsoft.2016.09.001
GTApprox: Surrogate modeling for industrial design

Abstract: We describe GTApprox, a new tool for medium-scale surrogate modeling in industrial design. Compared to existing software, GTApprox brings several innovations: a few novel approximation algorithms, several advanced methods of automated model selection, and novel options in the form of hints. We demonstrate the efficiency of GTApprox on a large collection of test problems. In addition, we describe several applications of GTApprox to real engineering problems.

Cited by 51 publications (29 citation statements)
References 36 publications
“…A total of 890807 data points were randomly selected for training and the remaining 46885 data points were used for testing. For this construction, the smart selection routine of Datadvance chose the high dimensional approximation [25], which is essentially a two-layer neural network. The relative mean distance measure (RDM) is 0.2102 and the natural logarithm of the magnification factor (ln(MAG)) is −0.0208.…”
Section: Results (mentioning, confidence: 99%)
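The quoted study evaluates a trained surrogate on a small held-out split of the data. The RDM metric it reports is not defined in this excerpt, so the sketch below uses a plain relative RMSE instead, and a nearest-neighbor predictor as a toy stand-in for the trained approximation (not GTApprox's HDA); the split ratio loosely mirrors the quoted ~95/5 partition.

```python
import math
import random

def relative_rmse(y_true, y_pred):
    """RMS of the residuals divided by RMS of the targets (assumed metric,
    not the RDM from the quoted study)."""
    n = len(y_true)
    num = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    den = math.sqrt(sum(t ** 2 for t in y_true) / n)
    return num / den

def nearest_neighbor_surrogate(train_x, train_y):
    """Toy surrogate: predict the response of the closest training point."""
    pairs = list(zip(train_x, train_y))
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

random.seed(0)
f = lambda x: math.sin(2.0 * math.pi * x)   # stand-in for the expensive response

xs = [random.uniform(0.0, 1.0) for _ in range(1000)]
train_x, test_x = xs[:950], xs[950:]        # ~95/5 split, as in the quoted setup
model = nearest_neighbor_surrogate(train_x, [f(x) for x in train_x])

err = relative_rmse([f(x) for x in test_x], [model(x) for x in test_x])
```

With a dense, well-spread training set the held-out relative error of even this crude surrogate is small, which is the basic sanity check the quoted passage performs with its own metrics.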
“…It is noted that pSeven has a smart selection option for scanning through a set of algorithms to select the best model among the set. For each training set it numerically optimizes both the choice of technique and its parameters [27,28] by minimizing the cross-validation error; see [25]. Among the algorithms scanned by pSeven are the following: ridge regression [29], stepwise regression [30], elastic net [31], Gaussian processes [32], sparse Gaussian processes [33,34], High Dimensional Approximation (HDA) [25,35], and High Dimensional Approximation combined with Gaussian processes (HDAGP); the latter technique is related to artificial neural networks and, more specifically, to the two-layer perceptron with a non-linear activation function [35].…”
Section: Regression Model of U_s(R_τ, τ) for a Realistic Head Model (mentioning, confidence: 99%)
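The selection procedure described above amounts to fitting each candidate technique, scoring it by cross-validation error, and keeping the minimizer. The sketch below illustrates that loop on two illustrative candidates (a constant-mean model and a least-squares line); the candidate set, fold scheme, and data are assumptions for the example, not pSeven's actual algorithm roster.

```python
import random

def fit_mean(xs, ys):
    """Candidate 1: constant predictor at the sample mean."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    """Candidate 2: ordinary least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return lambda x: a * x + b

def cv_error(fit, xs, ys, k=5):
    """k-fold cross-validation mean squared error of a fitting routine."""
    n, err = len(xs), 0.0
    for i in range(k):
        test_idx = set(range(i, n, k))
        tr_x = [x for j, x in enumerate(xs) if j not in test_idx]
        tr_y = [y for j, y in enumerate(ys) if j not in test_idx]
        model = fit(tr_x, tr_y)
        err += sum((model(xs[j]) - ys[j]) ** 2 for j in test_idx)
    return err / n

random.seed(1)
xs = [random.uniform(-1.0, 1.0) for _ in range(200)]
ys = [3.0 * x + 0.5 + random.gauss(0.0, 0.1) for x in xs]

candidates = {"mean": fit_mean, "linear": fit_linear}
best = min(candidates, key=lambda name: cv_error(candidates[name], xs, ys))
```

On near-linear data the cross-validation score correctly prefers the linear candidate; a real smart-selection routine additionally tunes each technique's hyperparameters against the same criterion.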
“…The constructed model can be used to speed up evaluations [57,2,36], for surrogate-based optimization [31,46], uncertainty quantification [42], sensitivity analysis [16,17,18], and adaptive design of experiments [19]. The maturity of this approach is confirmed not only by numerous applications but also by the availability of software packages dedicated to surrogate modeling that include Gaussian process regression-based approaches [8,35,1,10].…”
Section: Introduction (mentioning, confidence: 99%)
“…As the Cholesky factor for the updated model differs only in the last row, we calculate (8) and (9) in O(n²) operations. The total computational complexity is the sum of the computational complexities of the Cholesky decomposition update and the posterior mean and variance recalculation, so for a variable fidelity Gaussian process regression with a blackbox representing the low fidelity function, the following theorem holds true.…”
Section: Variable Fidelity Gaussian Process Regression with a Low-Fidelity Blackbox (mentioning, confidence: 99%)