Resource provisioning is one of the most challenging problems in cloud environments. Resources should be allocated dynamically according to the changing demands of applications. Over-provisioning wastes energy and increases costs; under-provisioning, on the other hand, causes Service Level Agreement (SLA) violations and degrades Quality of Service (QoS). Therefore, the allocated resources should track the current demand of applications as closely as possible, which requires determining their future demand. Predicting the future workload of applications is thus an essential step before resource provisioning. To the best of our knowledge, this paper is the first to propose a novel Prediction mOdel based on SequentIal paTtern mINinG (POSITING), which considers correlations between different resources and explicitly extracts behavioural patterns of applications, independently of any fixed pattern length. Based on the extracted patterns and the recent behaviour of an application, the future demand for resources is predicted. The main goal of this paper is to show that models based on pattern mining can offer novel and useful points of view for tackling some of the issues involved in predicting application workloads. The performance of the proposed model is evaluated on both real and synthetic workloads. The experimental results show that the proposed model improves prediction accuracy in comparison with other state-of-the-art methods such as moving average, linear regression, neural networks, and hybrid prediction approaches.
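To make the idea of pattern-based demand prediction concrete, the following is a deliberately minimal sketch, not the paper's actual POSITING algorithm: it mines all short contiguous subsequences of a discretized demand series, then matches the longest recent suffix to predict the next level. All function names and the toy data are hypothetical.

```python
from collections import Counter

def mine_patterns(series, max_len=3):
    """Count every (subsequence, successor) pair up to max_len -- a crude
    stand-in for sequential pattern mining over historical demand."""
    patterns = Counter()
    for n in range(1, max_len + 1):
        for i in range(len(series) - n):
            patterns[(tuple(series[i:i + n]), series[i + n])] += 1
    return patterns

def predict_next(series, patterns, max_len=3):
    """Match the longest recent suffix against the mined patterns and
    return its most frequent successor; fall back to the last value."""
    for n in range(max_len, 0, -1):
        suffix = tuple(series[-n:])
        candidates = {nxt: c for (p, nxt), c in patterns.items() if p == suffix}
        if candidates:
            return max(candidates, key=candidates.get)
    return series[-1]

# Discretized CPU demand levels: low=0, medium=1, high=2
history = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1]
pats = mine_patterns(history)
print(predict_next(history, pats))  # the repeating 0,1,2 cycle suggests 2
```

The actual model additionally correlates patterns across multiple resources (e.g. CPU and memory jointly), which this single-series sketch omits.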
Gaming on demand is an emerging service that has recently started to gain prominence in the gaming industry. Cloud-based video games provide affordable, flexible, and high-performance solutions for end-users with constrained computing resources and enable them to play high-end graphics games on low-end thin clients. Despite its advantages, cloud gaming's Quality of Experience (QoE) suffers from high and varying end-to-end delay. Since a significant part of the computational processing, including game rendering and video compression, is performed in data centers, controlling the transfer of information within the cloud has an important impact on the quality of cloud gaming services. In this article, a novel method for minimizing the end-to-end latency within a cloud gaming data center is proposed. We formulate an optimization problem for reducing delay and propose a time-efficient heuristic algorithm based on Lagrangian Relaxation (LR) as a practical solution. Simulation results indicate that the heuristic method can provide close-to-optimal solutions. The proposed model also reduces end-to-end delay and delay variation by almost 11% and 13.5%, respectively, outperforming existing server-centric and network-centric models. As a byproduct, our method achieves better fairness among multiple competing players, by almost 45% on average, in comparison with existing methods.
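For readers unfamiliar with Lagrangian Relaxation, the sketch below illustrates the general technique on a toy 0/1 knapsack instance rather than the article's data-center delay formulation: the hard capacity constraint is moved into the objective with a multiplier, the relaxed problem decomposes per item, and a subgradient update tightens the resulting dual bound. All names and numbers are illustrative assumptions.

```python
def lagrangian_bound(values, weights, capacity, iters=100, step=0.1):
    """Subgradient optimization of the Lagrangian dual of 0/1 knapsack.
    Relaxing the capacity constraint with multiplier lam makes the
    subproblem trivially solvable and yields an upper bound on the optimum."""
    lam, best = 0.0, float("inf")
    for _ in range(iters):
        # Relaxed problem separates per item: take item i iff v_i - lam*w_i > 0
        x = [1 if v - lam * w > 0 else 0 for v, w in zip(values, weights)]
        used = sum(w * xi for w, xi in zip(weights, x))
        bound = sum(v * xi for v, xi in zip(values, x)) + lam * (capacity - used)
        best = min(best, bound)          # keep the tightest dual bound seen
        g = capacity - used              # subgradient of the dual at lam
        lam = max(0.0, lam - step * g)   # project multiplier back to lam >= 0
    return best

# Tiny instance: the true optimum is 9 (items 0 and 2, weight 2+3=5 <= 5),
# so the returned dual bound must be at least 9.
print(lagrangian_bound([6, 5, 3], [2, 4, 3], 5))
```

In the article's setting the relaxed constraints are those coupling player-to-server assignments and network paths, but the overall loop (solve relaxed subproblem, update multipliers by subgradient, recover a feasible solution heuristically) follows this same shape.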