2023
DOI: 10.3934/dcdsb.2022095

Quantify uncertainty by estimating the probability density function of the output of interest using MLMC based Bayes method

Abstract: In uncertainty quantification, the quantity of interest is usually a statistic of the space and/or time integration of the system solution. In order to reduce the computational cost, a Bayes estimator based on multilevel Monte Carlo (MLMC) is introduced in this paper. The cumulative distribution function of the output of interest, that is, the expectation of the indicator function, is estimated by the MLMC method instead of classic Monte Carlo simulation. Then, combined with t…
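The abstract's central idea, estimating a CDF value P(Q ≤ s) as the expectation of an indicator function with MLMC rather than plain Monte Carlo, can be illustrated with a minimal sketch. The toy model (geometric Brownian motion discretized by Euler-Maruyama), the fixed per-level sample sizes, and the omission of the paper's Bayes estimator and any smoothing of the indicator are all illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch: MLMC estimate of P(Q <= s) = E[ 1{Q <= s} ] for a toy model.
# Assumptions (not from the paper): Q = X(T) for geometric Brownian motion,
# Euler-Maruyama with 2**l steps on level l, fixed sample counts per level.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, X0, T = 0.05, 0.2, 1.0, 1.0   # toy SDE: dX = mu*X dt + sigma*X dW

def coupled_indicator(level, n_samples, s):
    """Samples of 1{Q_fine <= s} - 1{Q_coarse <= s} on one MLMC level.

    Fine and coarse paths share the same Brownian increments; this coupling
    is what keeps the variance of the correction terms small.
    """
    nf = 2 ** level                      # fine time steps
    dtf = T / nf
    dWf = rng.normal(0.0, np.sqrt(dtf), size=(n_samples, nf))

    Xf = np.full(n_samples, X0)
    for k in range(nf):                  # fine Euler-Maruyama path
        Xf = Xf + mu * Xf * dtf + sigma * Xf * dWf[:, k]

    if level == 0:                       # no coarse path on the base level
        return (Xf <= s).astype(float)

    nc = nf // 2
    dtc = T / nc
    dWc = dWf[:, 0::2] + dWf[:, 1::2]    # coarse increments = summed fine pairs
    Xc = np.full(n_samples, X0)
    for k in range(nc):                  # coarse Euler-Maruyama path
        Xc = Xc + mu * Xc * dtc + sigma * Xc * dWc[:, k]

    return (Xf <= s).astype(float) - (Xc <= s).astype(float)

def mlmc_cdf(s, max_level=5, n_per_level=(40000, 20000, 10000, 5000, 2500, 1250)):
    """Telescoping sum: E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]."""
    return sum(coupled_indicator(l, n_per_level[l], s).mean()
               for l in range(max_level + 1))

if __name__ == "__main__":
    s = 1.1
    print(f"MLMC estimate of P(X(T) <= {s}): {mlmc_cdf(s):.4f}")
```

The sample sizes here are hard-coded for brevity; in practice they would be chosen adaptively from estimated per-level variances and costs, and the paper's contribution layers a Bayes estimator on top of this MLMC CDF estimate.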

Cited by 2 publications (2 citation statements). References 44 publications.
“…One such technique is Latin hypercube sampling (McKay et al, 2000), which splits the distribution into n equal partitions (where n is the number of samples required), and a sample is then taken from each partition. This sampling approach has been shown to improve computational efficiency when used with both a standard Monte Carlo method (McKay et al, 2000) and with MLMC (Xiong et al, 2022). Another technique is evolutionary algorithms (Vikhar, 2016), which are optimisation algorithms inspired by biological evolution that start with an initial set of samples (population) and evolve towards an optimal set.…”
Section: Myrtle Beach (mentioning)
confidence: 99%
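The citing statement's description of Latin hypercube sampling (split the distribution into n equal partitions and draw one sample from each) can be sketched as below. The one-dimensional setting, the standard normal target, and the helper name latin_hypercube_1d are illustrative assumptions, not the cited implementation.

```python
# Sketch of 1-D Latin hypercube sampling: split [0, 1) into n equal partitions,
# draw one point from each, then map through the inverse CDF of the target
# distribution (standard normal here, as an assumed example).
import numpy as np
from scipy.stats import norm

def latin_hypercube_1d(n, seed=None):
    """Return n stratified uniform samples, one from each of n equal partitions."""
    rng = np.random.default_rng(seed)
    edges = np.arange(n) / n                       # left edge of each partition
    u = edges + rng.uniform(0.0, 1.0 / n, size=n)  # one point inside each stratum
    rng.shuffle(u)                                 # remove ordering between strata
    return u

# Map the stratified uniforms to the target distribution via its inverse CDF.
samples = norm.ppf(latin_hypercube_1d(8, seed=0))
print(samples)
```

Because every partition contributes exactly one sample, the empirical distribution covers the input range more evenly than independent draws of the same size, which is the source of the efficiency gain reported for both standard Monte Carlo and MLMC in the cited works.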