Instrumentation is needed in process plants to obtain data that are essential to perform several activities: control of plants, assessment of the quality of products, production accounting (sometimes called yield accounting), detection of failures related to safety, and parameter estimation. This review concentrates on the techniques for the optimal allocation of instruments in grassroots as well as in retrofit scenarios. The article is a condensed version of a book by the author (Bagajewicz, 2000).
Model-Based Monitoring

Model-based monitoring consists of the use of a combination of models and on-line measurements to: a) produce estimates of measured and unmeasured variables; b) identify malfunctioning instrumentation; c) identify unsafe and/or faulty operating conditions and their origin; and d) identify events that can impact the efficiency and quality of products.

Kramer and Mah (1993), in an excellent overview of this issue, discussed several scenarios in which good estimates of data can be obtained. Data reconciliation and gross error detection are techniques that help accomplish tasks (a) and (b) and constitute a particular case of the concept of data rectification. While data reconciliation relies on analytical constraints and mostly on least-squares estimation, data rectification can obtain these estimates using techniques such as Kalman filtering, pattern recognition, neural networks, principal component analysis, and partial least squares, among others. Likewise, the detection of faults relies on techniques based on statistics, but several other techniques can be used. Tasks (c) and (d) are direct consequences of using a model.

The field of sensor network design and upgrading has traditionally relied on model-based concepts. Almost all work addressing monitoring goals considers: the ability of the sensor network to provide estimates of variables of interest; the ability to guarantee a certain accuracy through data reconciliation; reliability; the capability of identifying gross errors; and the capability of identifying process faults.
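To make the data reconciliation and gross error detection ideas concrete, the following is a minimal sketch (not from the source article) of constrained least-squares reconciliation for a single splitter with a linear mass balance A x = 0, followed by the classical chi-square global test on the balance residuals. The flow values, standard deviations, and the 95% critical value are illustrative assumptions.

```python
import numpy as np

# Linear balance constraint A @ x = 0 for a splitter:
# stream 1 splits into streams 2 and 3  ->  x1 - x2 - x3 = 0
A = np.array([[1.0, -1.0, -1.0]])

# Hypothetical measured flows and their standard deviations
y = np.array([100.4, 64.5, 36.2])
sigma = np.array([1.0, 0.8, 0.6])
V = np.diag(sigma**2)                 # measurement error covariance

# Least-squares reconciliation subject to A x = 0 (closed form):
# x_hat = y - V A^T (A V A^T)^{-1} A y
S = A @ V @ A.T                       # covariance of the residual r = A y
x_hat = y - V @ A.T @ np.linalg.solve(S, A @ y)

# Global (chi-square) test for gross errors: gamma = r^T S^{-1} r
r = A @ y
gamma = float(r @ np.linalg.solve(S, r))
chi2_crit = 3.841                     # chi2(0.95, df=1): one constraint

print("reconciled flows:", x_hat)     # satisfy the balance exactly
print("balance residual:", A @ x_hat)
print("global test:", "gross error suspected" if gamma > chi2_crit else "ok")
```

The reconciled estimates close the balance exactly while staying as close as possible (in the weighted least-squares sense) to the measurements; the global test flags a gross error when the constraint residual is too large to be explained by random measurement noise alone.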
Quality of Data

Quality of data is a term often used to refer to several properties that a set of data should have. These properties can be condensed into the following three broad attributes: accuracy; precision or reproducibility;
E-mail address: bagajewicz@ou.edu

The Canadian Journal of Chemical Engineering, Volume 80, February 2002

This article is a review of the area of optimal instrumentation location in process plants, for both grassroots designs and upgrades. Emphasis is placed on methods for monitoring purposes based on the cost minimization model. These methods are mostly related to data reconciliation paradigms and include special consideration of the detection of gross errors. Connections between this model and other models based on other objective functions and goals are analyzed. Finally, the emerging field of sensor design for efficient process fault detection is briefly covered.