The design of control, estimation or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in current networked systems, because of the unknown deterministic or stochastic transmission delays introduced by the communication network. At the diagnosis stage, this often generates false alarms: under nominal operation, the different transmission delays affecting the variables used in the residual computation cause the residuals to deviate from zero. A technique is proposed that aims at minimising the resulting false alarm rate, based on the explicit modelling of communication delays and on their best-case estimation.
Introduction

Owing to the growing complexity and spatial distribution of automated systems, communication networks have become the backbone of most control architectures. As systems are required to be more scalable and flexible, they include additional sensors, actuators and controllers, often referred to as field (intelligent) devices [1, 2]. Networked control systems result from connecting these system components via a communication network such as controller area network (CAN), PROFIBUS or Ethernet. An increasing amount of research addresses the distributed control of interconnected dynamical processes: stability and control [3-5], decision, co-ordination and scheduling [6, 7], diagnosis of discrete event systems [8] and fault tolerance [9-11]. However, only a few studies of the impact of the communication network on the diagnosis of continuous systems have recently been published [12-14].

In model-based fault detection and isolation (FDI), a set of residuals is designed that should ideally be zero in the fault-free case and different from zero in the faulty case [15-17]. In practice, however, residuals deviate from zero not only because of measurement noise, unknown inputs and modelling uncertainties, but also because of transmission delays. Since no network can communicate instantaneously, the data used in the residual computation do not represent the state of the system at the time of the computation; they represent its state at some (often unknown) earlier time. Moreover, since each variable may be transmitted with a different delay, the whole set of data used in the residual computation may not even be consistent with the system state at any single instant prior to the computation. Therefore, residuals that should theoretically be zero in the non-faulty case may trigger false alarms as a result of transmission delays; a minimal numerical sketch of this mechanism is given at the end of this section.

The false alarm rate can be decreased by raising the decision threshold, at the cost of reducing the sensitivity to faults. In this paper, a technique is proposed that aims at minimising the false alarms caused by transmission delays without increasing the number of missed detections. It relies on the explicit modelling of communication delays, and...
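To make the false-alarm mechanism concrete, the following Python sketch simulates a scalar first-order plant whose measurement reaches the diagnoser with a fixed transmission delay, and evaluates a parity-style residual r[k] = y[k] - a*y[k-1] - b*u[k-1] as if the delayed measurement were current. All numerical values (the plant parameters a and b, the two-sample delay and the threshold) are illustrative assumptions, and the residual form is a generic one, not the technique proposed in this paper.

import numpy as np

a, b = 0.9, 0.5                         # assumed plant parameters (illustrative)
delay = 2                               # assumed measurement delay, in samples
n = 200
u = np.sin(0.1 * np.arange(n))          # known control input, available locally
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = a * x[k] + b * u[k]      # fault-free plant, no noise

y_received = np.roll(x, delay)          # measurements arrive `delay` steps late
r = np.zeros(n)
for k in range(delay + 1, n):
    # The diagnoser wrongly treats y_received[k] as the current measurement,
    # so the residual mixes data that refer to different time instants.
    r[k] = y_received[k] - a * y_received[k - 1] - b * u[k - 1]

threshold = 0.05
print("max |r| in the fault-free case:", np.abs(r).max())
print("samples flagged as faulty:", int((np.abs(r) > threshold).sum()))

Although no fault is present, the residual repeatedly exceeds the threshold because the delayed measurements and the locally known input do not describe the system at the same instant; with a zero delay the residual is identically zero. Raising the threshold would silence these alarms only at the cost of detection sensitivity, which is precisely the trade-off the proposed technique seeks to avoid.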