The emergence of the Internet of Things (IoT) and its subsequent evolution into the Internet of Everything (IoE) are results of the rapid growth of information and communication technologies (ICT). Implementing these technologies, however, is constrained by limited energy resources and processing power. Consequently, there is a need for energy-efficient, intelligent load-balancing models, particularly in healthcare, where real-time applications generate large volumes of data. This paper proposes a novel, energy-aware, artificial intelligence (AI)-based load-balancing model that employs the Chaotic Horse Ride Optimization Algorithm (CHROA) and big data analytics (BDA) for cloud-enabled IoT environments. CHROA enhances the optimization capacity of the Horse Ride Optimization Algorithm (HROA) using chaotic principles. The proposed model balances the load, optimizes available energy resources using AI techniques, and is evaluated on several metrics. Experimental results show that the CHROA model outperforms existing models: while the Artificial Bee Colony (ABC), Gravitational Search Algorithm (GSA), and Whale Defense Algorithm with Firefly Algorithm (WD-FA) techniques attain average throughputs of 58.247 Kbps, 59.957 Kbps, and 60.819 Kbps, respectively, the CHROA model achieves an average throughput of 70.122 Kbps. The CHROA-based model thus presents an innovative approach to intelligent load balancing and energy optimization in cloud-enabled IoT environments, and the results highlight its potential to address critical challenges and contribute to efficient, sustainable IoT/IoE solutions.
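The abstract does not specify which chaotic map CHROA employs, so the following is only a minimal sketch of the general idea: a logistic map (an assumption, not the paper's stated choice) supplies deterministic chaotic coefficients in place of uniform random draws inside a population-based optimizer's position update.

```python
import numpy as np

def logistic_map(x, r=4.0):
    """One step of the logistic map; r = 4.0 gives fully chaotic behaviour on (0, 1)."""
    return r * x * (1.0 - x)

def chaotic_update(positions, best, chaos_state, step_scale=0.1):
    """Move each candidate toward the current best using chaotic coefficients
    instead of uniform random numbers (the generic 'chaotic enhancement' idea).

    positions   : (n, d) array of candidate solutions
    best        : (d,)   current best solution
    chaos_state : (n, d) array of logistic-map states in (0, 1)
    """
    chaos_state = logistic_map(chaos_state)   # advance the chaotic sequence
    step = chaos_state * (best - positions)   # chaotic attraction toward the best
    return positions + step_scale * step, chaos_state

# Toy usage: minimise the sphere function sum(x**2).
rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, size=(20, 4))
chaos = rng.uniform(0.1, 0.9, size=(20, 4))  # avoid 0, 0.5 and 1, which collapse the sequence
for _ in range(100):
    best = pos[np.argmin((pos ** 2).sum(axis=1))]
    pos, chaos = chaotic_update(pos, best, chaos)
print("best fitness:", (pos ** 2).sum(axis=1).min())
```

Because the logistic sequence is deterministic yet non-repeating, it tends to cover the search space more evenly than independent random draws, which is the usual motivation for chaotic variants of metaheuristics.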
Long Range Wide Area Network (LoRaWAN) provides desirable solutions for Internet of Things (IoT) applications that require hundreds or thousands of actively connected devices (nodes) to monitor an environment or process. In many such applications, the location information of the devices plays a critical role. In this regard, the physical characteristics of the communication channel can be leveraged to provide a feasible and affordable node localisation solution. This paper evaluates the performance of LoRaWAN Received Signal Strength Indicator (RSSI)-based node localisation in a sandstorm environment. We employ two machine learning algorithms, Support Vector Regression (SVR) and Gaussian Process Regression (GPR), which turn the high variance of RSSI caused by LoRaWAN's frequency-hopping feature to advantage, creating unique signatures that represent different locations. The RSSI features are used as location fingerprints that serve as input to the machine learning models. The proposed method reduces node localisation complexity compared with GPS-based approaches whilst providing more extensive connection paths. Furthermore, the impact of the LoRa spreading factor and the kernel function on the performance of the developed models has been studied. Experimental results show that the SVR-based fingerprint yields the largest improvement in node localisation performance.
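As a rough illustration of RSSI fingerprinting with SVR (not the authors' exact pipeline), the sketch below fits one RBF-kernel SVR per coordinate on simulated multi-channel RSSI vectors; the channel count, path-loss model, and hyperparameters are all assumptions standing in for real measurement data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Simulated fingerprint database: each row holds RSSI readings (dBm) across
# 8 channels (frequency hopping yields one feature per channel); targets are
# 2-D node coordinates in metres. Real survey data would replace this.
n_samples, n_channels = 500, 8
coords = rng.uniform(0, 100, size=(n_samples, 2))
gateway = np.array([50.0, 50.0])
dist = np.linalg.norm(coords - gateway, axis=1, keepdims=True)
rssi = -30 - 25 * np.log10(dist + 1) + rng.normal(0, 2, size=(n_samples, n_channels))

X_train, X_test, y_train, y_test = train_test_split(rssi, coords, random_state=0)

# One SVR per output coordinate; the RBF kernel echoes the paper's study of
# kernel functions, but C and epsilon here are illustrative guesses.
model = MultiOutputRegressor(SVR(kernel="rbf", C=100.0, epsilon=0.5))
model.fit(X_train, y_train)

err = np.linalg.norm(model.predict(X_test) - y_test, axis=1)
print(f"mean localisation error: {err.mean():.2f} m")
```

Swapping `SVR` for `sklearn.gaussian_process.GaussianProcessRegressor` would give the GPR variant of the same fingerprinting scheme, with the kernel object as the tunable choice.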
Dementias that develop in older people test the limits of modern medicine, and Alzheimer's disease (AD) is by far the most prevalent form. For over fifty years, AD was diagnosed using medical and exclusion criteria, with an accuracy of only 85%; a definitive diagnosis could be validated only through postmortem examination. Applying machine learning (ML) techniques to Magnetic Resonance Imaging (MRI) data can speed up AD diagnosis and predict the course of the disease. Dementia in individual seniors can be predicted from AD screening data using ML classifiers, and classifier performance can be enhanced by supplementing the MRI features with demographic information and the patient's preexisting conditions. In this article, we use the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset and propose a framework for AD/non-AD classification of dementia patients using longitudinal brain MRI features and a Deep Belief Network (DBN) trained with the Mayfly Optimization Algorithm (MOA). An IoT-enabled portable MR imaging device captures real-time patient MR images and identifies anomalies in the scans to detect and classify AD. Our experiments validate that the predictive power of all models is greatly enhanced by including early information about comorbidities and medication characteristics. The random forest model outperforms the other models in precision. This research is the first to examine how AD forecasting can benefit from multimodal time-series data. The proposed technique distinguishes healthy from diseased patients with an accuracy of 97.456%, an F-score of 93.187%, a recall of 95.789%, and a precision of 94.621%. These experimental results demonstrate the efficacy, superiority, and applicability of the DBN-MOA algorithm for AD diagnosis.
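scikit-learn has no full DBN, so the sketch below stacks two `BernoulliRBM` layers with a logistic-regression head as a rough stand-in for a DBN's greedy layer-wise pretraining, and a plain learning-rate sweep stands in for the Mayfly Optimization step; the digits dataset substitutes for MRI-derived features, so every element here is an assumption rather than the paper's method.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

# Stand-in data: scaled digit images in place of MRI-derived features.
X, y = load_digits(return_X_y=True)
X = X / 16.0  # RBMs expect inputs in [0, 1]

def build_dbn(lr):
    """Two stacked RBMs feeding a logistic-regression classifier --
    a rough approximation of a DBN with greedy layer-wise pretraining."""
    return Pipeline([
        ("rbm1", BernoulliRBM(n_components=128, learning_rate=lr, n_iter=15, random_state=0)),
        ("rbm2", BernoulliRBM(n_components=64, learning_rate=lr, n_iter=15, random_state=0)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])

# The paper tunes the DBN with the Mayfly Optimization Algorithm; a simple
# sweep over the RBM learning rate stands in for that metaheuristic here.
best_lr, best_score = None, -np.inf
for lr in (0.01, 0.05, 0.1):
    score = cross_val_score(build_dbn(lr), X, y, cv=3).mean()
    if score > best_score:
        best_lr, best_score = lr, score
print(f"best learning rate {best_lr}: CV accuracy {best_score:.3f}")
```

In a fuller treatment, the metaheuristic would search jointly over layer sizes, learning rates, and iteration counts, with cross-validated accuracy as its fitness function.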