Abstract: In stochastic process design and planning optimization problems, the expected value of the objective function in the face of uncertainty is typically evaluated through an n-dimensional integral, where n is the number of uncertain parameters. In this paper, suitable integration techniques are presented and computational issues are discussed in relation to the number of uncertain parameters and the uncertainty model considered. A specialized cubature technique, suitable to integrate normally distributed uncertaintie…
“…The number of simulations, performed by sampling the uncertain parameters according to the above-described quasi-Monte Carlo technique, is increased until the results converge to within 10⁻¹. It is found that 1000 simulations are required, which is in agreement with results reported by Bernardo et al (1999). Overall, the simulations require computation time on the order of several hours.…”
Section: Setting Up Simulation For Relaxation-based Dependability Analysis (supporting)
confidence: 87%
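The convergence check described in the excerpt can be sketched generically: keep doubling the sample count until two successive estimates of the expected objective agree to within the tolerance. In the sketch below, `simulate` is a cheap hypothetical stand-in for one expensive flowsheet simulation, and plain pseudo-random sampling replaces the quasi-Monte Carlo sequence for brevity; the function names and model are illustrative only, not the cited authors' implementation.

```python
import random

def simulate(theta):
    # Hypothetical stand-in for one expensive process simulation:
    # returns an objective value for a sampled uncertain parameter theta.
    return 100.0 + 20.0 * theta + 5.0 * theta ** 2

def expected_value(n_samples, seed=0):
    # Sample-average estimate of the expected objective. Restarting the
    # generator with the same seed makes successive estimates prefix
    # averages of one stream, so they settle down as n grows.
    rng = random.Random(seed)
    total = sum(simulate(rng.gauss(0.0, 1.0)) for _ in range(n_samples))
    return total / n_samples

def converge(tol=1e-1, n0=125):
    # Double the sample count until two successive estimates agree to tol,
    # mirroring the "increase until results converge to 10^-1" procedure.
    n, prev = n0, expected_value(n0)
    while True:
        n *= 2
        cur = expected_value(n)
        if abs(cur - prev) < tol:
            return n, cur
        prev = cur
```

With an expensive real simulator, each doubling is costly, which is why the excerpt stops near 1000 samples rather than driving the tolerance lower.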
“…In contrast to the random selection, a deterministic sequence of points can ensure that the most representative sample of the parameter space is obtained. It has been pointed out in the literature that these sampling methods only need to visit a small fraction of the space to obtain a representative answer (Bernardo et al, 1999). As Monte Carlo methods are often applied to problems where each function evaluation at a sample point is computationally expensive, as is the case in the problem studied here, simulations only on the order of hundreds to thousands can be performed in reasonable time.…”
Section: Brazilian Journal Of Chemical Engineering (mentioning)
confidence: 99%
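Deterministic low-discrepancy sequences of the kind referred to above need no library support; a minimal pure-Python Halton sequence (one common quasi-Monte Carlo construction, not necessarily the one used by the cited authors) looks like this:

```python
def halton(index, base):
    # Radical-inverse function: the index-th element of the 1-D
    # van der Corput sequence in the given (prime) base.
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def halton_points(n, bases=(2, 3)):
    # n points of the multi-dimensional Halton sequence; each coordinate
    # uses a distinct prime base so the points fill the unit cube evenly.
    return [tuple(halton(i, b) for b in bases) for i in range(1, n + 1)]
```

Unlike pseudo-random draws, the first n points are the same on every run and spread out over the unit cube by construction, which is what lets a small, fixed budget of expensive simulations be representative.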
“…Other authors have applied different numerical integration methods to the problem of design under uncertainty (e.g., Straub and Grossmann, 1990, and Bernardo et al, 1999). As Bernardo et al remark, the number of required samples grows exponentially with the dimensionality of the problem when quadrature methods, e.g., Gaussian quadrature, are used.…”
Section: Brazilian Journal Of Chemical Engineering (mentioning)
confidence: 99%
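The exponential growth mentioned in the excerpt follows directly from the tensor-product construction: a quadrature rule with q nodes per dimension requires q^n function evaluations in n dimensions. A one-line illustration:

```python
def tensor_product_points(nodes_per_dim, n_dims):
    # Total evaluation count for a full tensor-product quadrature grid.
    return nodes_per_dim ** n_dims

# With a 5-node Gaussian rule per dimension, the evaluation budget
# explodes with the number of uncertain parameters:
counts = {n: tensor_product_points(5, n) for n in range(1, 7)}
```

Six uncertain parameters already demand 15625 simulations per integral evaluation, which is why sampling-based methods with a roughly dimension-independent budget become attractive.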
“…Lastly, it should be noted that constraint relaxation in the context of this paper differs from the idea suggested by Bernardo et al (1999), who introduced a strategy, which they also call constraint relaxation, to simplify the integration that determines the expected value of the objective function.…”
Section: Relaxations As An Aid In Differentiating Between Numerical And Physical Infeasibility (mentioning)
A major challenge in chemical process design is to make design decisions based on partly incomplete or imperfect design input data. Still, process engineers are expected to design safe, dependable and cost-efficient processes under these conditions. The complexity of typical process models limits the use of intuitive engineering estimates to judge the impact of uncertain parameters on a proposed design. In this work, an approach to quantify the effect of uncertainty on a process design is presented in order to enhance comparisons among different designs. To facilitate automation, a novel relaxation-based heuristic is introduced to differentiate between numerical and physical infeasibility when simulations do not converge. It is shown how this methodology yields more detail about the limitations of a studied process design.
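The abstract does not spell out the heuristic, so the following is only a generic illustration of the relaxation idea, not the authors' method, and every name in it is hypothetical: when a case fails, the constraints are relaxed and the smallest violation that remains is used to separate physical infeasibility (no feasible point exists) from numerical trouble (a feasible point exists but the solver missed it).

```python
def min_violation(constraints, candidates):
    # Smallest worst-case constraint violation over a coarse search grid;
    # a toy stand-in for re-solving the model with relaxed constraints.
    # Each constraint g is interpreted as g(x) <= 0.
    def worst(x):
        return max(max(g(x), 0.0) for g in constraints)
    return min(worst(x) for x in candidates)

def diagnose(constraints, candidates, tol=1e-6):
    # If even the fully relaxed problem keeps a violation above tol, no
    # searched point satisfies the constraints: physically infeasible.
    # Otherwise the original failure was likely numerical.
    s = min_violation(constraints, candidates)
    return "physical" if s > tol else "numerical"
```

For example, the pair x <= 0.5 and x >= 0.7 has no feasible point on [0, 1] and is diagnosed as physical, while x <= 0.5 and x >= 0.3 leaves a feasible interval and is diagnosed as numerical.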
“…The literature varies on the choice of risk measure and on how the decision-maker's preferences are taken into account. The prices in each scenario are independent by definition, but within each scenario the product prices are conspicuously correlated with each other, and not independent as stated, for example, in Bernardo et al (1999). Therefore, the prices are considered constant, and the amounts that go into each product are taken as the random variables.…”
Section: Objective Function For The Refinery Including Stochastic Prices (mentioning)
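Correlated quantities of the kind discussed in the excerpt can be sampled from independent draws; the two-variable construction below, a special case of a Cholesky factorization with illustrative parameters only, shows the idea:

```python
import math
import random

def correlated_pair(rho, rng):
    # Two standard normal draws with correlation rho, built from two
    # independent draws (the 2-D case of a Cholesky factorization).
    z1 = rng.gauss(0.0, 1.0)
    e = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * e
    return z1, z2

def sample_correlation(rho, n=20000, seed=42):
    # Empirical correlation of n generated pairs, as a sanity check.
    rng = random.Random(seed)
    xs, ys = zip(*(correlated_pair(rho, rng) for _ in range(n)))
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / math.sqrt(vx * vy)
```

Treating correlated quantities as independent, as the excerpt criticizes, amounts to setting rho = 0 here, which understates the joint variability of the scenario.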
Abstract: We propose a two-stage stochastic programming model for a competitive oil refinery with stochastic crude and fuel prices. Most models for refineries are deterministic, and those that consider the stochastic problem do so by making a Gaussian assumption on profits, implementing variance as the risk measure. Our model falls into the category of optimisation with coherent risk measures, where robustness, rather than ambiguity, is the focus. The objective is to maximise the refiner's profit under raw material constraints, product inventory constraints and a financial risk constraint. The two-stage model leverages a unique discrete scenario generation technique alongside an admissible and computationally tractable drawdown risk measure. The expected value of perfect information calculation for each model gives a value for the additional benefit the decision-maker receives in considering the uncertainty inherent in the problem.
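The expected value of perfect information (EVPI) mentioned in the abstract is the gap between the wait-and-see value (decide after the scenario is revealed) and the here-and-now value (commit to one decision beforehand). A toy two-scenario example with made-up profit figures:

```python
# profit[d][s]: profit of first-stage decision d if scenario s occurs.
# All figures are illustrative, not from the cited refinery model.
profit = {
    "small": {"low": 4.0, "high": 6.0},
    "large": {"low": 1.0, "high": 9.0},
}
probs = {"low": 0.5, "high": 0.5}

def here_and_now():
    # Commit to one decision before the scenario is known: maximize the
    # probability-weighted profit over decisions.
    return max(sum(probs[s] * v for s, v in row.items())
               for row in profit.values())

def wait_and_see():
    # Perfect information: pick the best decision in every scenario,
    # then average over scenarios.
    return sum(probs[s] * max(row[s] for row in profit.values())
               for s in probs)

evpi = wait_and_see() - here_and_now()
```

Here both decisions have the same expected profit of 5.0, but knowing the scenario in advance yields 6.5 on average, so the EVPI of 1.5 is the most a decision-maker should pay for perfect foresight.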