Abstract—In this paper, we focus on reducing the on-grid energy consumption of Heterogeneous Radio Access Networks (HetNets) supplied with hybrid power sources (grid and renewables). The energy-efficiency problem is analyzed over both short and long timescales by means of reactive and proactive management strategies. For the short-timescale case, a renewable-energy-aware User Equipment (UE)–Base Station (BS) association is proposed and analyzed for the case in which no storage infrastructure is available. For the long-timescale case, a traffic flow method is proposed for load balancing among renewable-energy BSs; it is combined with a model predictive controller (MPC) that incorporates forecasts of the renewable energy source behavior in order to better exploit a Green HetNet with storage support. The mechanisms are evaluated with solar measurement data from the region of Valle de Aburrá, Medellín, Colombia, and wind estimations from the Moscow region, Russian Federation. Results show that the green association proposal can reduce on-grid energy consumption in a HetNet by up to 34%, exceeding the savings obtained by other methods, including the best-signal-level policy, by up to 15%, while providing high network efficiency and low computational complexity. For the long-timescale case, the MPC achieves savings of up to 22% with respect to the on-grid-only Macro-BS approach. Finally, an analysis of our proposals in a common scenario highlights the relevance of storage management, while emphasizing the importance of combining reactive and proactive methods in a common framework to exploit the best of each approach.
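The idea of a forecast-aware storage controller can be sketched in a few lines. The toy Python below is an illustrative assumption, not the paper's model: the function names (`mpc_step`, `run_horizon`), the flat-load assumption, and the half-reserve heuristic are all hypothetical, meant only to show how a receding-horizon controller can use a renewable forecast to decide how much stored energy to spend now versus hold back for upcoming deficits.

```python
# Toy receding-horizon storage controller for a hybrid-powered BS.
# All quantities and the reserve policy are illustrative assumptions.

def mpc_step(soc, capacity, load, renewable_forecast, horizon=3):
    """Decide the battery action for the current slot.

    Surplus renewable energy charges the battery; a forecast deficit over
    the look-ahead window encourages keeping some charge in reserve.
    Returns (grid_draw, new_soc).
    """
    surplus = renewable_forecast[0] - load            # current-slot balance
    if surplus >= 0:
        return 0.0, min(capacity, soc + surplus)      # store the excess
    deficit = -surplus
    # Expected future deficit over the window (assumes a flat load).
    future_deficit = sum(max(0.0, load - r) for r in renewable_forecast[1:horizon])
    reserve = min(soc, future_deficit * 0.5)          # hold half for later slots
    discharge = min(soc - reserve, deficit)
    return deficit - discharge, soc - discharge

def run_horizon(soc, capacity, load, renewables):
    """Apply the controller slot by slot; return the total on-grid energy."""
    total_grid = 0.0
    for t in range(len(renewables)):
        grid, soc = mpc_step(soc, capacity, load, renewables[t:])
        total_grid += grid
    return total_grid
```

In a real MPC the per-slot decision would come from an optimization over the whole horizon, re-solved at every step; the fixed half-reserve rule above merely stands in for that look-ahead.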
IEEE 802.11 (Wi-Fi) is one of the technologies that provide high performance with a high density of connected devices, supporting emerging demanding services such as virtual and augmented reality. However, in highly dense deployments, Wi-Fi performance is severely affected by interference. This problem is even worse in newer standards, such as 802.11n/ac, where features such as Channel Bonding (CB) are introduced to increase network capacity, but at the cost of using wider spectrum channels. Finding the best channel assignment in dense deployments under dynamic environments with CB is challenging, given its combinatorial nature. Therefore, the use of analytical or system models to predict Wi-Fi performance after potential changes (e.g., dynamic channel selection with CB, or the deployment of new devices) is not suitable, due to either low accuracy or high computational cost. This paper presents a novel, data-driven approach to speed up this process, using a Graph Neural Network (GNN) model that exploits the information carried in the deployment's topology and the intricate wireless interactions to predict Wi-Fi performance with high accuracy. The evaluation results show that preserving the graph structure in the learning process yields a 64% improvement over a naive approach, and around 55% over other Machine Learning (ML) approaches, when using all training features.
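The core of such a model is message passing over the interference graph. The following sketch is an illustrative assumption, not the paper's architecture: the node features, the chain-shaped adjacency, and the mean-aggregation update are all hypothetical, and the weights are random where a trained GNN would learn them from simulated deployments.

```python
import numpy as np

def message_pass(h, adj, w_self, w_neigh):
    """One round of mean-aggregation message passing over the graph."""
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)  # guard isolated nodes
    neigh = (adj @ h) / deg                                # mean over neighbours
    return np.tanh(h @ w_self + neigh @ w_neigh)

def predict_throughput(feats, adj, w_self, w_neigh, w_out, rounds=2):
    """Embed each AP with a few message-passing rounds, then read out a scalar."""
    h = feats
    for _ in range(rounds):
        h = message_pass(h, adj, w_self, w_neigh)
    return h @ w_out                                       # per-AP prediction

rng = np.random.default_rng(0)
n, d = 4, 3                            # 4 APs, 3 features (e.g., width, load, power)
feats = rng.normal(size=(n, d))
adj = np.array([[0, 1, 0, 0],          # edges link APs whose channels overlap
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
w_self, w_neigh = rng.normal(size=(d, d)), rng.normal(size=(d, d))
w_out = rng.normal(size=(d, 1))
pred = predict_throughput(feats, adj, w_self, w_neigh, w_out)
```

Because the update aggregates over each node's neighbours, the prediction for an AP depends on exactly the interference relations encoded in the graph, which is what a topology-agnostic ML model discards.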
Data centers have been gaining increasing interest due to their capacity to store large amounts of information; as a result, they have become the main support for the deployment of cloud computing. Currently, data centers face performance problems such as lack of guaranteed quality of service, security risks, management complexity, and inflexibility. Virtualized data centers are an emerging tool to address these issues. However, their implementation brings a major challenge: how to optimally allocate the resource demands of virtual data centers on top of the physical infrastructure, so that data center operating costs are reduced, revenue for infrastructure providers is improved, and quality-of-service agreements with end users are fulfilled. This challenge is known as the Virtual Data Center Embedding problem. This paper presents a survey of current research on this problem and proposes a taxonomic classification of the main approaches to solving it. Finally, opportunities for future research in resource allocation for virtualized data centers are discussed.
Previous research has shown the relationship between the number of users connected to a cellular network base station (BS) and its energy consumption. For this reason, the study of optimal mechanisms that balance the user load over the available BSs is a key element in the field of energy efficiency in cellular networks. The goal of this paper is to propose and assess different user-BS association mechanisms that reduce grid consumption in heterogeneous cellular networks powered by hybrid energy sources (grid and renewable energy). These schemes are compared with the traditional best-signal-level mechanism and evaluated via simulation using key performance indicators related to grid consumption, number of users served, and average transmission rate per user. Our results show that the newly proposed user allocation policies reduce grid electricity consumption while also reducing the number of unserved users compared with the traditional association scheme.
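A renewable-aware association policy of this kind can be sketched as a simple greedy assignment. The Python below is an illustrative assumption, not the paper's exact mechanism: the `associate` function, the capacity model, and the green-first-then-best-signal fallback are hypothetical, meant only to contrast with a pure best-signal-level rule.

```python
# Toy renewable-aware user-BS association (illustrative assumption):
# prefer a renewable-powered ("green") BS with free capacity among the BSs
# the user can hear; otherwise fall back to the strongest signal.

def associate(users, bss):
    """users: list of {bs_id: rsrp_dBm} per user;
    bss: dict bs_id -> {'green': bool, 'capacity': int}.
    Returns {user_index: bs_id}."""
    load = {b: 0 for b in bss}
    assignment = {}
    for i, rsrp in enumerate(users):
        ranked = sorted(rsrp, key=rsrp.get, reverse=True)  # strongest first
        choice = None
        for b in ranked:                                   # pass 1: green BSs
            if bss[b]['green'] and load[b] < bss[b]['capacity']:
                choice = b
                break
        if choice is None:                                 # pass 2: best signal
            for b in ranked:
                if load[b] < bss[b]['capacity']:
                    choice = b
                    break
        if choice is not None:
            assignment[i] = choice
            load[choice] += 1
    return assignment
```

Under a best-signal-level rule every user would take the first entry of `ranked`; the green-first pass diverts users toward renewable-powered BSs until their capacity is exhausted, which is the trade-off the compared policies explore.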