Current IoT trends reveal an increase in the computational requirements for data processing. Traditionally, sensor data was uploaded to compute nodes in a backend cloud. However, the ever-growing amount of data generated by IoT devices has rendered this option too expensive in terms of network traffic, potentially leading to delays caused by bottlenecks. Moreover, even if network connectivity were guaranteed, live processing of sensitive data (e.g., biomedical data) at a remote location may not comply with data protection policies. A popular approach circumvents these issues by performing computation locally, that is, at the IoT gateway level. This demo leverages open-source lightweight virtualization tools and a container orchestration engine (Docker and Kubernetes, respectively) on a cluster of IoT devices at the edge of the network, enabling the creation of a distributed pool of computing resources on top of which data analytics algorithms can be deployed, updated, or terminated. This approach ensures that resource-hungry operations, such as live monitoring and real-time processing of sensitive data, are performed locally, reducing overall delay and avoiding the risk of data leaking to the outside world.
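To make the deploy/update/terminate lifecycle concrete, the following is a minimal sketch (not the demo's actual code) of how analytics workloads could be managed on such an edge cluster using the official Kubernetes Python client. The image name, deployment name, namespace, and resource limits are illustrative placeholders, and the demo itself may use kubectl or other tooling instead.

```python
# Illustrative sketch: managing a containerized analytics workload on the
# edge (gateway-level) Kubernetes cluster. All names are placeholders.
from kubernetes import client, config


def deploy_analytics(image="example/edge-analytics:latest", replicas=2):
    """Create a Deployment running the analytics container on the edge cluster."""
    config.load_kube_config()  # kubeconfig assumed to point at the gateway cluster
    apps = client.AppsV1Api()

    container = client.V1Container(
        name="edge-analytics",
        image=image,
        resources=client.V1ResourceRequirements(
            # Keep requests modest so the workload fits on constrained IoT gateways.
            limits={"cpu": "500m", "memory": "256Mi"}
        ),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="edge-analytics"),
        spec=client.V1DeploymentSpec(
            replicas=replicas,
            selector=client.V1LabelSelector(match_labels={"app": "edge-analytics"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "edge-analytics"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    apps.create_namespaced_deployment(namespace="default", body=deployment)


def update_analytics(image):
    """Roll out a new analytics image; Kubernetes replaces pods incrementally."""
    config.load_kube_config()
    apps = client.AppsV1Api()
    patch = {"spec": {"template": {"spec": {"containers": [
        {"name": "edge-analytics", "image": image}]}}}}
    apps.patch_namespaced_deployment(name="edge-analytics",
                                     namespace="default", body=patch)


def terminate_analytics():
    """Remove the analytics workload from the edge cluster."""
    config.load_kube_config()
    apps = client.AppsV1Api()
    apps.delete_namespaced_deployment(name="edge-analytics", namespace="default")
```

Because the orchestration engine handles scheduling and rolling updates, the same lifecycle operations apply whether the pool consists of a single gateway or many, which is what allows analytics to stay local to the network edge.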