Inference in dynamic error‐in‐variable‐measurement problems (1997)
DOI: 10.1002/aic.690430414

Abstract: Efficient algorithms were developed for estimating model parameters from measured data, even in the presence of gross errors. In addition to point estimates of parameters, however, assessments of uncertainty are needed. Linear approximations provide standard errors, but they can be misleading when applied to models that are substantially nonlinear. To overcome this difficulty, profiling methods were developed for the case in which the regressor variables are error-free. These methods provide accurate nonlinear …
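The profiling idea the abstract refers to can be sketched numerically. This is a minimal illustration, not the paper's algorithm: a hypothetical two-parameter model y = a·exp(θx) is fitted to synthetic, error-free-regressor data; the amplitude a is re-optimized analytically at each fixed θ (the profiling step), and an F-based threshold on the profiled sum of squares yields a nonlinear interval for θ instead of a linear-approximation standard error:

```python
# Profile-based interval for a nonlinear parameter (illustrative sketch).
# Model, data, and threshold are invented for this example.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x) + rng.normal(0.0, 0.05, x.size)  # synthetic data

def profile_sse(theta):
    # For fixed theta, re-fit the amplitude a analytically: this inner
    # re-optimization of the remaining parameters is the "profiling" step.
    basis = np.exp(theta * x)
    a = float(y @ basis) / float(basis @ basis)
    return float(np.sum((y - a * basis) ** 2))

fit = minimize_scalar(profile_sse, bounds=(0.0, 3.0), method="bounded")
theta_hat, sse_min = fit.x, fit.fun

# Profile interval: theta values whose profiled SSE stays within an
# F-based threshold of the minimum (n = 20 points, p = 2 parameters;
# F_{0.95}(1, 18) is roughly 4.41).
grid = np.linspace(theta_hat - 0.5, theta_hat + 0.5, 401)
threshold = sse_min * (1.0 + 4.41 / (x.size - 2))
inside = np.array([profile_sse(t) <= threshold for t in grid])
lo, hi = grid[inside].min(), grid[inside].max()
```

Unlike a symmetric ±2·SE band from a linearization, the interval [lo, hi] follows the actual curvature of the sum-of-squares surface, which is the point the abstract makes about nonlinear models.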

Cited by 7 publications (7 citation statements)
References 26 publications
“…After minimization of the plant–model mismatch, this model is used for process optimization using an economic objective function. Because essentially the same detailed model can be used for data reconciliation and parameter estimation (and later process optimization), various authors advocate that these two tasks should be considered simultaneously through the solution of a large online optimization problem, using a global errors-in-variables approach, for which decomposition strategies are available to speed its solution. A number of industrial reports reinforce the importance of the parameter estimation task to the success of developing adequate models for the application of model-based supervision strategies to FCC units.…”
Section: Monitoring of Process and Quality Variables
Citation type: mentioning (confidence: 99%)
“…Interactive software, hardware, and information technology systems are qualitatively represented in Figure . Certain reconciled datasets are acquired from the historical server of the DCS and used to initialize the so-called “Error-In-Variable” method (EVM, described later), which is an optimization problem aimed at estimating the adaptive parameters and, hence, adapting the detailed process simulation to the current plant operating conditions. The adaptive parameters are used to simulate the process and reproduce a reliable and coherent picture of the operating conditions; in this case, the data used to initialize the process simulation are the fresh data coming from the field (not yet reconciled).…”
Section: Architecture of the CAPE Solution
Citation type: mentioning (confidence: 99%)
“…The second problem to be solved is the so-called “Error-in-Variable” method (EVM), originally proposed by Biegler, to whom one could refer for more details. Briefly, it is formulated as follows:

$$\min_{\mathbf{x}_i,\,\boldsymbol{\theta}} \; \Phi = \sum_{i=1}^{\mathrm{SSC}} \left(\mathbf{m}_i - \mathbf{x}_i\right)^{T} \mathbf{Q}^{-1} \left(\mathbf{m}_i - \mathbf{x}_i\right)$$

$$\text{subject to}\quad \mathbf{f}\left(\mathbf{x}_i, \boldsymbol{\theta}\right) = \mathbf{0}$$

where Φ is the objective function; x and m are the vectors of reconciled and measured values, respectively; Q is the positive definite diagonal matrix of weights; and g(x) = 0 and h(x) ≤ 0 are the equality and inequality constraints to which the minimization problem is subjected. The key point of EVM is its large dimension with respect to plain data reconciliation, since its degrees of freedom are the adaptive parameters θ and the reconciled vectors of each steady-state condition (SSC) acquired by the historical server of the DCS.…”
Section: Architecture of the CAPE Solution
Citation type: mentioning (confidence: 99%)
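The quoted EVM formulation can be sketched in a few lines. This is a minimal illustration under invented assumptions: a single scalar model constraint f(x_i, θ) = x_{i,2} − θ·x_{i,1} = 0 links two measured variables, Q is the identity, and three synthetic steady-state conditions stand in for the DCS data; scipy's general-purpose SLSQP solver is used instead of the decomposition strategies the citing papers mention:

```python
# Errors-in-variables (EVM) sketch: jointly estimate the reconciled states
# x_i and the adaptive parameter theta, per the formulation quoted above.
import numpy as np
from scipy.optimize import minimize

# Noisy measurements m_i = (x1, x2) from 3 steady-state conditions,
# generated around the "true" relation x2 = 2.0 * x1.
m = np.array([[1.0, 2.1],
              [2.0, 3.9],
              [3.0, 6.2]])
Q_inv = np.eye(2)  # inverse weight matrix (identity for simplicity)

def objective(z):
    # Phi = sum_i (m_i - x_i)^T Q^{-1} (m_i - x_i); z packs the
    # reconciled states x_i (flattened) followed by theta.
    x = z[:-1].reshape(m.shape)
    return sum((m[i] - x[i]) @ Q_inv @ (m[i] - x[i]) for i in range(len(m)))

def model_residuals(z):
    # Equality constraints f(x_i, theta) = x_i2 - theta * x_i1 = 0 for
    # every steady-state condition.
    x, theta = z[:-1].reshape(m.shape), z[-1]
    return x[:, 1] - theta * x[:, 0]

# Initialize at the raw measurements with theta = 1 and solve.
z0 = np.concatenate([m.ravel(), [1.0]])
res = minimize(objective, z0,
               constraints={"type": "eq", "fun": model_residuals})
theta_hat = res.x[-1]
```

The degrees of freedom grow with the number of steady-state conditions (here 3 × 2 states plus one parameter), which is the "large dimension" point the citing paper makes about EVM relative to plain data reconciliation.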
“…Data reconciliation has been largely studied, and several approaches and techniques have been proposed in the literature, but from a process and chemical point of view, its applications have mainly been reduced to solving daily/weekly production accounting as well as approximate performance-monitoring issues. Moreover, compared with the most recent promising approaches, such as dynamic reconciliation, model-based performance monitoring, etc., only the simplest methods have been used to practically tackle the aforementioned problems.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)