In recent years, several schemes have been proposed to detect anomalies and attacks on Cyber-Physical Systems (CPSs) such as Industrial Control Systems (ICSs). Based on the analysis of sensor data, unexpected or malicious behavior is detected. These schemes often rely on (implicit) assumptions of temporally stable sensor data distributions and invariants between process values. Unfortunately, the proposed schemes often do not perform optimally, with Recall scores below 70% (i.e., missing more than 3 of every 10 anomalies) on some ICS datasets, and the root causes remain unclear. In this work, we propose a general framework to analyze whether a given ICS dataset has specific properties (stable sensor distributions in normal operation, potentially state-dependent), which in turn allows one to determine whether certain Anomaly Detection approaches can be expected to perform well. We apply our framework to three datasets and show that the behavior of actuators and sensors differs considerably between the training and test sets. In addition, we present high-level guidelines to consider when designing an Anomaly Detection System.
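The authors' framework is not reproduced here, but as an illustration of the kind of property check the abstract describes, the sketch below compares per-sensor value distributions between a training split and a test split using a two-sample Kolmogorov-Smirnov test. The function name, significance threshold, and DataFrame layout (one column per sensor) are assumptions chosen for illustration, not the paper's implementation.

```python
import pandas as pd
from scipy.stats import ks_2samp


def distribution_stability(train: pd.DataFrame, test: pd.DataFrame,
                           alpha: float = 0.01) -> pd.DataFrame:
    """Compare per-sensor value distributions between training and test data.

    Flags sensors whose empirical distributions differ significantly
    (two-sample Kolmogorov-Smirnov test). A significant difference would
    violate the temporal-stability assumption many detectors rely on.
    Ideally `test` contains only normal-operation segments, so that
    drift is not confounded with the labeled anomalies themselves.
    """
    results = {}
    for sensor in train.columns.intersection(test.columns):
        stat, p_value = ks_2samp(train[sensor].dropna(), test[sensor].dropna())
        results[sensor] = {"ks_stat": stat,
                           "p_value": p_value,
                           "stable": p_value >= alpha}
    return pd.DataFrame(results).T


# Hypothetical usage: train_df/test_df are sensor-reading DataFrames.
# report = distribution_stability(train_df, test_df)
# print(report[~report["stable"]])  # sensors that drift between the splits
```

A per-sensor test like this gives a coarse first pass; a state-dependent analysis, as the abstract suggests, would additionally condition each comparison on the process state (e.g., actuator settings) before comparing distributions.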