The Internet of Things (IoT) is driving the digital revolution. Almost all economic sectors are becoming "Smart" thanks to the analysis of data generated by the IoT. This analysis is carried out by advanced artificial intelligence (AI) techniques that provide insights never before imagined. The combination of IoT and AI is giving rise to an emerging trend, called AIoT, which is opening up new paths to carry digitization into a new era. However, there is still a big gap between AI and IoT, stemming from the computational power required by the former and the scarce computational resources offered by the latter. This is particularly true in rural IoT environments, where the lack of connectivity (or low-bandwidth connections) and of power supply forces the search for "efficient" alternatives that provide computational resources to IoT infrastructures without increasing power consumption. In this paper, we explore edge computing as a solution for bridging the gap between AI and IoT in rural environments. We evaluate the training and inference stages of a deep-learning-based precision-agriculture application for frost prediction on a modern Nvidia Jetson AGX Xavier in terms of performance and power consumption. Our experimental results reveal that edge devices are still a long way from matching cloud approaches in terms of performance, but the inclusion of GPUs in edge devices offers new opportunities in scenarios where connectivity is still a challenge.
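As a rough illustration of the kind of measurement such an evaluation involves, below is a minimal sketch of timing batched inference in PyTorch on a CUDA-capable device such as the Jetson AGX Xavier. The small fully connected network, input width, and batch size are placeholders rather than the paper's frost-prediction model, and power sampling (e.g. via the Jetson's onboard sensors) is omitted.

```python
import time
import torch
import torch.nn as nn

# Hypothetical stand-in for the frost-prediction network; the actual
# architecture used in the paper is not reproduced here.
model = nn.Sequential(
    nn.Linear(24, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device).eval()

# Assumed input shape: 24 sensor readings per sample, batches of 256.
batch = torch.randn(256, 24, device=device)

# Warm up before timing to exclude one-off initialization costs.
with torch.no_grad():
    for _ in range(10):
        model(batch)

if device.type == "cuda":
    torch.cuda.synchronize()
start = time.perf_counter()
with torch.no_grad():
    for _ in range(100):
        model(batch)
if device.type == "cuda":
    torch.cuda.synchronize()
elapsed = time.perf_counter() - start
print(f"Mean batch latency: {1000 * elapsed / 100:.3f} ms")
```

The explicit `torch.cuda.synchronize()` calls matter on GPU devices, since kernel launches are asynchronous and wall-clock timing without them would undercount the real latency.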
We are witnessing the dramatic consequences of the COVID-19 pandemic which, unfortunately, go beyond the impact on the health system. Until herd immunity is achieved through vaccination, the only available mechanisms for controlling the pandemic are quarantines, perimeter closures, and social distancing, with the aim of reducing mobility. Governments apply these measures only for limited periods, since they involve the closure of economic activities such as tourism, cultural events, and nightlife. The main criterion for establishing these measures and planning socioeconomic subsidies is the evolution of infections. However, the collapse of the health system and the unpredictability of human behavior, among other factors, make this evolution difficult to predict in the short to medium term. This article evaluates different models for the early prediction of the evolution of the COVID-19 pandemic, with the aim of creating a decision-support system for policy-makers. We consider a wide range of models, including artificial neural networks such as LSTMs and GRUs and statistical models such as autoregressive (AR) and ARIMA models. Moreover, several consensus strategies that combine all models into a single ensemble are proposed to obtain better results in this uncertain environment. Finally, a multivariate model that includes mobility data provided by Google is proposed to better forecast trend changes in the 14-day cumulative incidence (CI). A real case study in Spain is evaluated, providing very accurate predictions of the 14-day CI in scenarios with and without trend changes, reaching an $R^2$ of 0.93, an RMSE of 4.16, and an MAE of 1.08.
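To make the consensus idea and the reported metrics concrete, the following is a minimal sketch that averages hypothetical per-model forecasts and scores the result with scikit-learn. The numbers and the unweighted-mean rule are illustrative assumptions, not the paper's actual models, data, or ensembling scheme.

```python
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Hypothetical per-model forecasts of the 14-day cumulative incidence (CI);
# in the paper these would come from the LSTM, GRU, AR, and ARIMA models.
forecasts = {
    "lstm":  np.array([101.0, 108.0, 115.0, 121.0]),
    "gru":   np.array([ 99.0, 107.0, 116.0, 124.0]),
    "ar":    np.array([103.0, 110.0, 118.0, 125.0]),
    "arima": np.array([100.0, 106.0, 114.0, 122.0]),
}
observed = np.array([100.0, 107.0, 116.0, 123.0])

# One simple consensus strategy: an unweighted mean of all model outputs.
consensus = np.mean(list(forecasts.values()), axis=0)

# The three metrics reported in the abstract.
print(f"R^2:  {r2_score(observed, consensus):.3f}")
print(f"RMSE: {mean_squared_error(observed, consensus) ** 0.5:.3f}")
print(f"MAE:  {mean_absolute_error(observed, consensus):.3f}")
```

Weighted or selective consensus rules (e.g. weighting models by recent validation error) are natural variations on the same structure.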
Agriculture is one of the key sectors where technology is opening new opportunities to disrupt the market. The Internet of Things (IoT) could reduce production costs and increase product quality by providing intelligence services via IoT analytics. However, harsh weather conditions and the lack of connectivity in the field limit the successful deployment of such services, since they require both fully connected infrastructures and substantial computational resources. Edge computing has emerged as a solution that brings computing power into close proximity to the sensors, providing energy savings, highly responsive services, and the ability to mask transient cloud outages. In this paper, we propose an IoT monitoring system that activates anti-frost techniques to avoid crop loss, defining two intelligent services to detect outliers caused by sensor errors. The former is a nearest-neighbor technique and the latter is the k-means algorithm, which provides better-quality results but increases the computational cost. Cloud and edge computing approaches are analyzed by targeting two different low-power GPUs. Our experimental results show that cloud-based approaches provide the highest performance in general, but edge computing is a compelling alternative for masking transient cloud outages and providing highly responsive data-analytics services in technologically hostile environments.
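The k-means service can be pictured as clustering recent sensor readings and flagging points that lie far from their assigned centroid. The sketch below, in scikit-learn, uses synthetic data, an arbitrary cluster count, and a 95th-percentile distance threshold; all of these are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical (temperature, humidity) readings; real data would come
# from the field sensors described in the paper.
readings = rng.normal(loc=[4.0, 80.0], scale=[1.0, 5.0], size=(200, 2))
readings[::50] += [15.0, -40.0]  # inject a few sensor-error outliers

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(readings)

# Flag points far from their assigned centroid as outliers; the 95th
# percentile threshold is an illustrative choice.
distances = np.linalg.norm(
    readings - kmeans.cluster_centers_[kmeans.labels_], axis=1
)
threshold = np.percentile(distances, 95)
outliers = np.where(distances > threshold)[0]
print(f"Flagged {outliers.size} suspect readings at indices {outliers}")
```

The extra clustering step is what makes this approach more expensive than a plain nearest-neighbor comparison, which matches the cost/quality trade-off noted in the abstract.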