Abstract-Cyber-Physical Systems (CPS) have gained wide popularity; however, developing and debugging CPS remain significant challenges. Many bugs are detectable only at runtime, under deployment conditions that may be unpredictable or at least unexpected at development time. The current state of the practice in debugging CPS is generally ad hoc, involving trial and error in a real deployment. For increased rigor, it is appealing to bring formal methods to CPS verification; however, developers often eschew formal approaches due to their complexity and lack of efficiency. This paper presents BraceAssertion, a specification framework based on natural language queries that are automatically converted to a deterministic class of timed automata used for runtime monitoring. To reduce runtime overhead and support properties that reference predicate logic, we use a second monitor automaton to create filtered traces on which the specification monitor performs its analysis. We evaluate the BraceAssertion framework using a real CPS case study and show that the framework is able to minimize runtime overhead even with an increasing number of monitors.
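As a rough illustration of the two-monitor idea, the sketch below filters a raw event trace down to the events a property references and then checks the filtered trace with a small deterministic timed automaton. The class names, event names, and the bounded-response property are our own illustrative assumptions, not BraceAssertion's actual specification language or API.

```python
# Illustrative sketch only (not BraceAssertion's API): a filter monitor reduces
# the raw trace to the events a property mentions, then a deterministic timed
# automaton checks a bounded-response property ("every 'request' is followed by
# a 'response' within MAX_DELAY seconds") over the filtered trace.
from dataclasses import dataclass

MAX_DELAY = 0.5  # assumed timing bound, in seconds

@dataclass
class Event:
    name: str         # e.g., "request", "response", "sensor_read" (assumed names)
    timestamp: float   # seconds since system start
    payload: dict      # predicate arguments (e.g., node id, sensor value)

class FilterMonitor:
    """Second monitor automaton: forwards only the events the property
    references, optionally applying a predicate over the payload."""
    def __init__(self, relevant, predicate=lambda e: True):
        self.relevant = set(relevant)
        self.predicate = predicate

    def filter(self, trace):
        return [e for e in trace if e.name in self.relevant and self.predicate(e)]

class BoundedResponseMonitor:
    """Specification monitor: a deterministic timed automaton with two
    locations, IDLE (no pending request) and WAIT (request seen, clock running)."""
    def __init__(self, bound):
        self.bound = bound
        self.location = "IDLE"
        self.clock_start = None
        self.violated = False

    def step(self, event):
        if self.location == "IDLE" and event.name == "request":
            self.location, self.clock_start = "WAIT", event.timestamp
        elif self.location == "WAIT":
            if event.timestamp - self.clock_start > self.bound:
                self.violated = True          # clock invariant exceeded
            elif event.name == "response":
                self.location, self.clock_start = "IDLE", None

    def check(self, trace):
        for event in trace:
            self.step(event)
            if self.violated:
                return False
        return True

if __name__ == "__main__":
    raw_trace = [
        Event("sensor_read", 0.00, {"id": 3}),   # irrelevant to the property
        Event("request",     0.10, {"id": 3}),
        Event("sensor_read", 0.30, {"id": 3}),
        Event("response",    0.45, {"id": 3}),   # within the 0.5 s bound
    ]
    filtered = FilterMonitor({"request", "response"}).filter(raw_trace)
    print("property holds:", BoundedResponseMonitor(MAX_DELAY).check(filtered))
```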
I. INTRODUCTION
Cyber-Physical Systems (CPS) are found in applications in structural monitoring, autonomous vehicles, and many other fields. Compared with the growth of the domain, verification and validation of CPS lag far behind [32]. The state of the practice in debugging CPS generally entails a combination of simulation and in situ debugging. In a vehicle competing in the 2007 DARPA Urban Challenge, a bug that had gone undetected through more than 300 miles of test driving resulted in a near collision. An analysis of the incident found that, to protect the steering system, the interface to the physical hardware limited the steering rate to low speeds [23]. When the path planner produced a sharp turn at higher speeds, the vehicle physically could not follow, and this unanticipated situation triggered the bug. The analysis concluded that, although simulation-centric tools are indispensable for rapid prototyping, design, and debugging, they are limited in providing correctness guarantees. State-of-the-art formal methods tools, including static analysis, theorem proving, and model checking, are insufficient to tackle the challenges of CPS verification and validation [32]. Other verification techniques, including model-based testing [11] and simulation [16], have high learning curves, impractical development costs, and scalability issues. Domain-specific tools (e.g., passive distributed assertions [29] and symbolic execution [30]), though more scalable, fail to formally verify qualitative constraints (e.g., the ordering of events), quantitative constraints (e.g., timing), or both. Ad hoc debugging has become the de facto