Abstract. Ambient assisted living technologies and services make it possible to support elderly and impaired people and to increase their personal autonomy. In particular, vision-based approaches enable the recognition of human behaviour, upon which valuable services can be built. A key constraint, however, is that such approaches must operate online and in real time. In this work, a human action recognition method based on a bag-of-key-poses model and sequence alignment is extended to support continuous human action recognition. The detection of action zones is proposed to locate the most discriminative segments of an action. For recognition, a method based on a sliding and growing window approach is presented. Furthermore, an evaluation scheme designed specifically for ambient assisted living scenarios is introduced. Experimental results on two publicly available datasets show that the proposed action zones lead to a significant improvement in accuracy and allow real-time processing.