Data centers are an integral part of cloud computing, supporting Web services, online social networking, data analysis, computation-intensive applications, and scientific computing. They require high-performance components for their interprocess communication, storage, and sub-communication systems. The performance bottleneck, which used to be processing power, has now shifted to communication speed within data centers. The performance of a data center, in terms of throughput and delay, is directly related to the performance of the underlying internal communication network. In this paper, we introduce an analytical model that can be used to evaluate the underlying network architecture in data centers. The model can further be used to develop simulation tools that extend the scope of performance evaluation beyond what can be achieved with the theoretical model, covering various network topologies, different traffic distributions, scalability, and load balancing. While the model is generic, we focus on its implementation for fat-tree networks, which are widely used in data centers. The theoretical results are compared and validated against simulation results for several network configurations. The results of this analysis provide a basis for data center network design and optimization.
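As a rough illustration of the topology this abstract targets (and not of the paper's analytical model), the following Python sketch sizes a standard k-ary fat-tree and approximates per-hop delay with a textbook M/M/1 queue; all function names, rates, and the 5-hop path assumption are illustrative.

```python
# Hypothetical sketch: sizing a standard k-ary fat-tree and estimating
# per-hop delay with a simple M/M/1 approximation. This is NOT the
# paper's analytical model; names and parameters are illustrative only.

def fat_tree_size(k: int) -> dict:
    """Return element counts for a standard k-ary fat-tree (k must be even)."""
    assert k % 2 == 0, "fat-tree arity k must be even"
    edge = agg = k * (k // 2)          # k pods, k/2 edge and k/2 aggregation switches each
    core = (k // 2) ** 2
    hosts = k * (k // 2) * (k // 2)    # k^3 / 4 hosts in total
    return {"edge": edge, "agg": agg, "core": core, "hosts": hosts}

def mm1_hop_delay(arrival_rate: float, service_rate: float) -> float:
    """Mean sojourn time at one switch modeled as an M/M/1 queue."""
    assert arrival_rate < service_rate, "queue must be stable (rho < 1)"
    return 1.0 / (service_rate - arrival_rate)

if __name__ == "__main__":
    print(fat_tree_size(4))            # {'edge': 8, 'agg': 8, 'core': 4, 'hosts': 16}
    # an inter-pod path crosses 5 switches: edge -> agg -> core -> agg -> edge
    print(5 * mm1_hop_delay(arrival_rate=0.7, service_rate=1.0))
```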
Internet of Things (IoT) devices, particularly those used in sensor networks, are often latency-sensitive. The topology of a sensor network largely depends on the overall system application; common configurations include linear, star, hierarchical, and mesh in 2D or 3D deployments. Other applications include underwater communication with high attenuation of radio waves, disaster relief networks, rural networking, environmental monitoring networks, and vehicular networks. These networks share the same performance concerns, including link latency, latency variation (jitter), and tail latency. Achieving predictable performance is critical for many interactive and latency-sensitive applications. In this paper, a two-stage tandem queuing model is developed to estimate the average end-to-end latency and predict the latency variation in closed form. The model also provides a feedback mechanism to investigate other major performance metrics, such as utilization and the optimal number of computing units needed in a single cluster. The model is applied to two classes of networks, namely Edge Sensor Networks (ESNs) and Data Center Networks (DCNs). While the proposed model is derived theoretically from a queuing-based formulation, simulation results for various network topologies and under different traffic conditions confirm its accuracy.
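To make the closed-form idea concrete, here is a minimal Python sketch of the textbook result for two M/M/1 queues in tandem (Jackson/Burke); it is offered as an illustration of the modelling style, not as the paper's exact derivation, and the arrival and service rates are assumed values.

```python
# Illustrative sketch (not the paper's exact model): closed-form mean and
# variance of end-to-end delay for two M/M/1 queues in tandem with Poisson
# arrivals. Per-stage sojourn times are treated as independent (Reich).

def tandem_mm1_latency(lam: float, mu1: float, mu2: float):
    """Return (mean, variance) of end-to-end sojourn time."""
    assert lam < mu1 and lam < mu2, "both stages must be stable"
    t1, t2 = 1.0 / (mu1 - lam), 1.0 / (mu2 - lam)   # per-stage mean sojourn times
    mean = t1 + t2
    var = t1 ** 2 + t2 ** 2                          # exponential sojourn at each stage
    return mean, var

mean, var = tandem_mm1_latency(lam=80.0, mu1=100.0, mu2=120.0)  # jobs/second
print(f"mean latency = {mean * 1e3:.1f} ms, jitter (std) = {var ** 0.5 * 1e3:.1f} ms")
```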
Medical image processing has become an active research topic in the healthcare sector for effective decision making and diagnosis of diseases. Magnetic resonance imaging (MRI) is a widely used tool for the classification and detection of prostate cancer. Since the manual screening process for prostate cancer is difficult, automated diagnostic methods have become essential. This study develops a novel Deep Learning based Prostate Cancer Classification (DTL-PSCC) model using MRI images. The presented DTL-PSCC technique encompasses an EfficientNet-based feature extractor for the generation of a set of feature vectors. In addition, the fuzzy k-nearest neighbour (FKNN) model is utilized for the classification process, where class labels are assigned to the input MRI images. Moreover, the membership values of the FKNN model are optimally tuned using the krill herd algorithm (KHA), which results in improved classification performance. To demonstrate the classification performance of the DTL-PSCC technique, a wide range of simulations was carried out on benchmark MRI datasets. The extensive comparative results confirmed the superiority of the DTL-PSCC technique over recent methods, with a maximum accuracy of 85.09%.
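For illustration only, the sketch below implements the standard fuzzy k-NN decision rule (Keller et al.) on pre-extracted feature vectors; the KHA-based membership tuning described in the abstract is omitted, and the fuzzifier m, neighbourhood size k, and the toy features are assumptions.

```python
# Hedged sketch: standard fuzzy k-NN membership computation on feature vectors
# (e.g., produced by a CNN backbone). The KHA-based tuning from the abstract
# is NOT implemented here; all data and hyperparameters are illustrative.
import numpy as np

def fknn_predict(X_train, y_train, x, k=5, m=2.0):
    """Return per-class membership values for a single feature vector x."""
    d = np.linalg.norm(X_train - x, axis=1) + 1e-12      # distances to training set
    nn = np.argsort(d)[:k]                               # indices of k nearest neighbours
    w = d[nn] ** (-2.0 / (m - 1.0))                      # inverse-distance weights
    classes = np.unique(y_train)
    return {c: float(np.sum(w[y_train[nn] == c]) / np.sum(w)) for c in classes}

# toy usage with random "EfficientNet-like" features (purely illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
y = rng.integers(0, 2, size=100)
print(fknn_predict(X, y, X[0]))
```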
Infodemiology uses web-based data to inform public health policymakers. This study aimed to examine the diffusion of Arabic-language discussions and analyze the nature of Internet search behaviors related to the global COVID-19 pandemic through two platforms (Twitter and Google Trends) in Saudi Arabia. A set of Arabic Twitter data related to COVID-19 was collected and analyzed. Using Google Trends, Internet search behaviors related to the pandemic were explored. Health and risk perceptions and information related to the adoption of COVID-19 infodemic markers were investigated. Moreover, Google mobility data were used to assess the relationship between different community activities and the pandemic transmission rate, and to investigate how changes in mobility could predict new COVID-19 cases. The results show that the top COVID-19-related misinformation terms on Twitter were folk remedies from low-quality sources. The number of COVID-19 cases in the Saudi provinces showed a strong, statistically significant negative correlation with COVID-19 search queries on Google Trends (Pearson r = −0.63, p < 0.05). The reduction in mobility is highly correlated with a decrease in the total number of cases in Saudi Arabia. Finally, total cases are the most significant predictor of new COVID-19 cases.
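As a minimal, hedged example of the reported correlation analysis, the snippet below computes a Pearson correlation with scipy.stats.pearsonr on purely hypothetical weekly values; it reproduces the method, not the study's data or result.

```python
# Minimal sketch (illustrative data only): Pearson correlation between
# Google Trends search interest and confirmed COVID-19 case counts,
# mirroring the style of the reported r = -0.63, p < 0.05 analysis.
from scipy.stats import pearsonr

search_interest = [95, 88, 80, 72, 64, 55, 47, 40]           # hypothetical Trends index
new_cases       = [120, 180, 260, 340, 410, 470, 530, 600]   # hypothetical case counts

r, p = pearsonr(search_interest, new_cases)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```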