Abstract. In this paper, we study resource allocation at the infrastructure level, rather than how physical resources are mapped to virtual resources for better resource utilization in a cloud computing environment. We propose a new infrastructure resource allocation algorithm that dynamically allocates virtual machines among cloud computing applications based on a deadlock detection algorithm, and that uses a threshold method to optimize resource reallocation decisions. We implemented and evaluated the proposed algorithm using the CloudSim simulator. The experimental results show that our algorithm can quickly detect deadlock and then resolve the situation, yielding improvements of approximately an order of magnitude in practical cases.

Keywords: Cloud Computing, Resource Allocation, Heterogeneous Platforms, Deadlock Detection.
Introduction

"Recently, there has been a dramatic increase in the popularity of cloud computing systems that rent computing resources on demand, bill on a pay-as-you-go basis, and multiplex many users on the same physical infrastructure. These cloud computing environments provide cloud users with the illusion of infinite computing resources, which they can increase or decrease at will. In many cases, the need for these resources exists only for a very short period of time."

Since information and communication technology (ICT) systems were introduced and began to play a significant role in the life of smart cities, the virtualization of information technology infrastructure has contributed significantly to solving the major problems of the successor systems of distributed computing, grid computing, and parallel computing. In particular, the trend is to use cloud computing, whose key component is the distribution of virtual servers.
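The core idea stated in the abstract, detecting deadlock among virtual machines and triggering reallocation via a threshold rule, can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes a simple wait-for graph between VMs and an illustrative utilization threshold, with all names (find_deadlock, should_reallocate) hypothetical:

```python
# Hypothetical sketch: deadlock detection as cycle detection on a
# wait-for graph between VMs, plus a simple threshold rule for
# deciding when to reallocate resources. Illustrative only.

def find_deadlock(wait_for):
    """Detect a cycle in a wait-for graph {vm: [vms it waits on]}.
    Returns the cycle as a list of VMs, or None if no deadlock."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / in progress / done
    color = {vm: WHITE for vm in wait_for}
    stack = []

    def dfs(vm):
        color[vm] = GRAY
        stack.append(vm)
        for nxt in wait_for.get(vm, []):
            if color.get(nxt, WHITE) == GRAY:      # back edge -> cycle found
                return stack[stack.index(nxt):]
            if color.get(nxt, WHITE) == WHITE:
                cycle = dfs(nxt)
                if cycle:
                    return cycle
        stack.pop()
        color[vm] = BLACK
        return None

    for vm in wait_for:
        if color[vm] == WHITE:
            cycle = dfs(vm)
            if cycle:
                return cycle
    return None

def should_reallocate(utilization, threshold=0.8):
    """Threshold rule: trigger reallocation when host utilization
    exceeds the threshold (the 0.8 value is an assumption)."""
    return utilization > threshold

# Example: vm1 waits on vm2, vm2 on vm3, vm3 on vm1 -> circular wait.
graph = {"vm1": ["vm2"], "vm2": ["vm3"], "vm3": ["vm1"]}
print(find_deadlock(graph))  # ['vm1', 'vm2', 'vm3']
```

Once a cycle is found, a resolution step (e.g., migrating or preempting one VM in the cycle) would break the circular wait; the threshold check keeps reallocation from firing on lightly loaded hosts.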