Collecting vast amounts of data and performing the complex calculations that feed modern Numerical Weather Prediction (NWP) algorithms requires centralizing intelligence in some of the most powerful, energy- and resource-hungry supercomputers in the world. This is due to the chaotic, complex nature of the atmosphere, whose interpretation requires virtually unlimited computing and storage resources. With Machine Learning (ML) techniques, a statistical approach can be designed to perform weather forecasting. Moreover, the recently growing interest in Edge Computing and Tiny Intelligence architectures is driving a shift towards the deployment of ML algorithms on Tiny Embedded Systems (ES). This paper describes how Deep but Tiny Neural Networks (DTNN) can be designed to be parsimonious and automatically converted into a C library optimized for STM32 microcontrollers through the X-CUBE-AI toolchain; we propose the integration of the obtained library with Miosix, a Real-Time Operating System (RTOS) tailored for resource-constrained and tiny processors, which is an enabling factor for system scalability and multitasking. With our experiments we demonstrate that it is possible to deploy a DTNN for atmospheric pressure forecasting, with a FLASH and RAM occupation of 45.5 KByte and 480 Byte respectively, on an affordable, cost-effective system. We deployed the system in a real context, obtaining the same prediction quality as the same model deployed in the cloud, but with the advantage of processing all the data needed for the prediction close to the environmental sensors, avoiding raw data traffic to the cloud.