Reactivity is an essential property of a synchronous program. Informally, it guarantees that at each instant a program fed with an input will 'react' by producing an output. In the present work, we consider a refined property that we call feasible reactivity. Beyond reactivity, this property guarantees that at each instant both the size of the program and its reaction time are bounded by a polynomial in the size of the parameters at the beginning of the computation and the size of the largest input. We propose a method to annotate programs and we develop related static analysis techniques that guarantee feasible reactivity for programs expressed in the Sπ-calculus. The latter is a synchronous version of the π-calculus based on the SL synchronous programming model.
Introduction

Mastering the computational complexity of programs is an important aspect of computer security, with applications ranging from embedded systems to mobile code and smartcards. One approach to this problem is to monitor resource consumption at run time and to raise an exception when some bound is reached. A variant of this approach is to instrument the code so that bounds are checked at appropriate times. An alternative approach is to statically analyse the program to guarantee that during its execution it will respect certain resource bounds. In other words, the first approach performs a dynamic verification while the second relies on a static analysis. As usual, the main advantage of the first approach is its flexibility, while the advantages of the second are that it introduces no overhead at run time and, perhaps more importantly, that it allows an early detection of 'buggy' programs. In this work, we will focus on static analysis, which offers the more challenging problems, while keeping in mind that the two approaches are complementary. For instance, static analyses may be helpful in reducing the frequency of dynamic verifications.
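The dynamic-verification approach contrasted above can be illustrated with a minimal sketch: the code is instrumented so that each step consumes a unit of a resource budget, and an exception is raised when the budget is exhausted. All names here (`ResourceExceeded`, `bounded_steps`) are illustrative and not taken from the paper.

```python
class ResourceExceeded(Exception):
    """Raised when the instrumented program exceeds its step budget."""

def bounded_steps(budget):
    """Instrument a function so each call consumes one unit of a budget.

    This is the run-time monitoring discussed above: the bound is
    checked dynamically, and an exception signals a violation.
    """
    state = {"left": budget}
    def decorate(f):
        def wrapper(*args, **kwargs):
            if state["left"] <= 0:
                raise ResourceExceeded(f"budget of {budget} steps exhausted")
            state["left"] -= 1
            return f(*args, **kwargs)
        return wrapper
    return decorate

@bounded_steps(100)
def step(x):
    # The monitored computation: here, a trivial increment.
    return x + 1

x = 0
try:
    while True:
        x = step(x)
except ResourceExceeded:
    # The monitor stops the loop after exactly 100 permitted steps.
    pass
```

A static analysis, by contrast, would establish such a bound before execution, avoiding both the per-step check and the possibility of a run-time failure; this is the trade-off the paragraph above describes.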