2010
DOI: 10.1080/07408170903394306
DDDAS-based multi-fidelity simulation framework for supply chain systems

Abstract: Dynamic-Data-Driven Application Systems (DDDAS) is a new modeling and control paradigm that adaptively adjusts the fidelity of a simulation model. The fidelity is adjusted against the available computational resources by incorporating dynamic data into the executing model, which in turn steers the measurement process for selective data updates. To this end, a comprehensive system architecture and methodologies are first proposed, where the components include a real-time DDDAS simulation, grid mo…

Cited by 50 publications
(26 citation statements)
References 31 publications
“…It is noted that the level of fidelity affects both the simulation execution time and the time taken to collect the required sensory updates. In our earlier work on DDDAMS (Celik et al. 2007), we developed four algorithms, which are embedded into the simulation to enable the DDDAMS capability (see Figure 1). The purpose of Algorithm 1 is to filter noise and detect an abnormal status of the system based on the current sensory measurements (such as temperature and pressure).…”
Section: Overview of DDDAMS
confidence: 99%
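The quoted Algorithm 1 (noise filtering plus abnormal-status detection from sensory data) could be sketched as follows. This is a hypothetical minimal illustration, not the paper's actual algorithm: it smooths readings with a moving average and flags a reading as abnormal when it deviates from the smoothed baseline by more than a threshold. The window size and threshold are assumptions.

```python
from collections import deque

def make_abnormality_detector(window=5, threshold=3.0):
    """Hypothetical Algorithm-1-style sketch: moving-average noise filter
    plus a simple deviation threshold (both parameters are assumptions)."""
    history = deque(maxlen=window)

    def detect(reading):
        history.append(reading)
        baseline = sum(history) / len(history)   # noise-filtered estimate
        return abs(reading - baseline) > threshold  # True => abnormal
    return detect

detect = make_abnormality_detector(window=5, threshold=3.0)
readings = [20.1, 20.3, 19.9, 20.2, 35.0]  # e.g., temperature in degrees C
flags = [detect(r) for r in readings]      # last reading is flagged
```

A real DDDAMS deployment would replace the moving average with whatever filter the measurement noise model calls for; the point is only the filter-then-threshold structure.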
“…While this is true at the strategic and tactical levels, it becomes even more so at the operational level, as the number of parameters as well as the update frequency of each parameter grow significantly. In order to enable timely planning, monitoring, and control of these supply chains at the operational level in an economical and effective way, we earlier proposed the dynamic-data-driven adaptive multi-scale simulation (DDDAMS) architecture (Celik et al. 2007). This research is believed to be the first effort in the literature to 1) handle the dynamicity of the system by selectively incorporating up-to-date information into the simulation-based real-time controller, and 2) introduce adaptive simulations that are capable of adjusting their level of detail according to the changing conditions of a supply chain in the most economical way.…”
confidence: 99%
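The adaptive level-of-detail idea in the quote above can be illustrated with a toy fidelity selector. This is a sketch under stated assumptions: the three fidelity levels, their costs, and the escalation rule are all invented for illustration and are not the paper's actual decision logic.

```python
def select_fidelity(cpu_budget, anomaly_detected):
    """Hypothetical fidelity selector: run at the coarsest (cheapest)
    level during normal operation, but escalate to the finest level the
    CPU budget (arbitrary units) can afford when an anomaly calls for
    closer monitoring. Levels and costs are illustrative assumptions."""
    # (level, cost) ordered from coarsest to finest
    levels = [("aggregate", 1), ("medium", 4), ("detailed", 10)]
    affordable = [name for name, cost in levels if cost <= cpu_budget]
    if anomaly_detected and len(affordable) > 1:
        return affordable[-1]                 # finest affordable level
    return affordable[0] if affordable else "aggregate"
```

For example, an anomaly with ample budget selects `"detailed"`, while a tight budget caps the escalation at `"medium"` — mirroring the quoted trade-off between fidelity and available computational resources.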
“…Since real-time dynamic data can represent the up-to-date state of the environment, a new simulation paradigm is emerging. It entails the ability to dynamically incorporate additional real-time data when executing simulations, and promises much more accurate analysis and predictions [28]. This new dynamic data-driven simulation paradigm has been applied to a variety of research domains in recent years, including crisis management, environmental science, disaster forecasting, biotechnology, finance, and trade [29][30][31][32][33][34][35][36].…”
Section: The Data-Driven Agent-Based Modeling
confidence: 99%
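The dynamic data-driven simulation paradigm described above (incorporating real-time data into an executing simulation) can be sketched as a single assimilation step. This is a minimal illustration, not the cited assimilation scheme: the stand-in model dynamics and the blending weight `alpha` are assumptions.

```python
def run_dddas_step(state, new_measurement, alpha=0.5):
    """Hypothetical dynamic data-driven step: advance a stand-in model,
    then, if a fresh real-time measurement arrived, blend it with the
    model prediction via a simple weighted correction (alpha is an
    assumed weight; real schemes use e.g. Kalman-style filters)."""
    predicted = state * 1.1             # stand-in model dynamics
    if new_measurement is not None:     # real-time data available
        return (1 - alpha) * predicted + alpha * new_measurement
    return predicted                    # pure simulation otherwise
```

With no measurement the model runs open-loop; with one, the state is pulled toward the observed value, which is the mechanism by which up-to-date data keeps the executing simulation accurate.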
“…Fueled by recent developments in data analytics and machine learning, data-driven approaches to building surrogate models have been gaining great popularity among diverse scientific disciplines. We now have a collection of techniques that have enabled progress across a wide spectrum of applications, including design optimization [1,2,3], the design of materials [4,5] and supply chains [6], model calibration [7,8,9], and uncertainty quantification [10,11,12,13,14,15,16,17,18,19,20,21]. Such approaches are built on the premise of treating the true data-generating process as a black box, and try to construct parametric surrogates of some form y = f_θ(x) directly from observed input-output pairs {x, y}.…”
Section: Introduction
confidence: 99%
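The black-box surrogate idea y = f_θ(x) in the last quote can be demonstrated with the simplest possible parametric family. This sketch fits f_θ(x) = a·x + b by ordinary least squares to observed input-output pairs; real surrogates use richer families (Gaussian processes, neural networks), so the linear choice here is purely illustrative.

```python
def fit_linear_surrogate(xs, ys):
    """Fit the simplest parametric surrogate f_theta(x) = a*x + b to
    observed {x, y} pairs by ordinary least squares. Illustrative only:
    practical surrogates use far richer model families."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return lambda x: a * x + b   # the learned surrogate f_theta

# Pairs observed from some expensive black-box process (here, y = 2x + 1)
f = fit_linear_surrogate([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Once fitted, `f` can stand in for the expensive process at new inputs, which is exactly the role surrogates play in the design-optimization and uncertainty-quantification applications the quote lists.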