With the recent advent of 4G LTE networks, there has been increasing interest in better understanding their performance and power characteristics compared with 3G/WiFi networks. In this paper, we take one of the first steps in this direction. Using a publicly deployed tool we designed for Android called 4GTest, which attracted more than 3000 users within 2 months, together with extensive local experiments, we study the network performance of LTE networks and compare it with other types of mobile networks. We observe that LTE generally has significantly higher downlink and uplink throughput than 3G and even WiFi, with median values of 13 Mbps and 6 Mbps, respectively. We develop the first empirically derived comprehensive power model of a commercial LTE network, with less than 6% error rate and state transitions matching the specifications. Using a comprehensive data set consisting of 5-month traces of 20 smartphone users, we carefully investigate the energy usage in 3G, LTE, and WiFi networks and evaluate the impact of configuring LTE-related parameters. Despite several new power-saving improvements, we find that, based on the user traces, LTE is as much as 23 times less power efficient than WiFi, and even less power efficient than 3G, with the long high-power tail being a key contributor. In addition, we perform case studies of several popular applications on Android in LTE and identify that the performance bottleneck for web-based applications lies less in the network, compared to our previous study in 3G [24]. Instead, the device's processing power, despite significant improvement compared to our analysis two years ago, becomes more of a bottleneck.
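Why the high-power tail dominates energy usage can be illustrated with a minimal sketch of a tail-aware radio energy model. The state structure (active transfer, then a high-power tail held for an inactivity timer, then idle) follows the behavior described above, but every power level and timer value here is a hypothetical placeholder, not a measured parameter from the study:

```python
# Minimal sketch of a tail-aware radio energy model.
# All numbers are hypothetical placeholders, not measured LTE parameters.

ACTIVE_POWER_MW = 1200.0  # assumed power while actively transferring
TAIL_POWER_MW = 1000.0    # assumed power in the high-power tail state
IDLE_POWER_MW = 30.0      # assumed baseline idle power
TAIL_TIMER_S = 10.0       # assumed inactivity timer before demotion to idle

def energy_mj(transfers, horizon_s):
    """Energy over [0, horizon_s] for non-overlapping (start, duration) transfers.

    After each transfer the radio lingers in the tail state for TAIL_TIMER_S
    (or until the next transfer begins), then falls back to idle.
    """
    energy = 0.0
    t = 0.0            # current time
    tail_until = 0.0   # time until which the radio remains in the tail state
    for start, dur in sorted(transfers):
        # gap before this transfer: tail power first, then idle power
        tail_time = max(0.0, min(start, tail_until) - t)
        idle_time = max(0.0, start - max(t, tail_until))
        energy += tail_time * TAIL_POWER_MW + idle_time * IDLE_POWER_MW
        # the transfer itself at active power
        energy += dur * ACTIVE_POWER_MW
        t = start + dur
        tail_until = t + TAIL_TIMER_S
    # remaining time after the last transfer
    tail_time = max(0.0, min(horizon_s, tail_until) - t)
    idle_time = max(0.0, horizon_s - max(t, tail_until))
    energy += tail_time * TAIL_POWER_MW + idle_time * IDLE_POWER_MW
    return energy
```

With these placeholder numbers, two one-second transfers spaced 30 s apart over a one-minute window spend far more energy in the two 10-second tails than in the transfers themselves, which is the qualitative effect the trace analysis attributes to the tail.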
3G cellular data networks have recently witnessed explosive growth. In this work, we focus on UMTS, one of the most popular 3G mobile communication technologies. Our work is the first to accurately infer, for any UMTS network, the state machine (both transitions and timer values) that guides the radio resource allocation policy, through a lightweight probing scheme. We systematically characterize the impact of operational state machine settings by analyzing traces collected from a commercial UMTS network, and pinpoint the inefficiencies caused by the interplay between smartphone applications and the state machine behavior. Beyond basic characterization, we explore the optimal state machine settings in terms of several critical timer values evaluated using real network traces. Our findings suggest that the fundamental limitation of the current state machine design is its static nature of treating all traffic according to the same inactivity timers, making it difficult to balance tradeoffs among radio resource usage efficiency, network management overhead, device radio energy consumption, and performance. To the best of our knowledge, our work is the first empirical study that employs real cellular traces to investigate the optimality of UMTS state machine configurations. Our analysis also demonstrates that traffic patterns impose significant impact on radio resource and energy consumption. In particular, we propose a simple improvement that reduces YouTube streaming energy by 80% by leveraging an existing feature called fast dormancy supported by the 3GPP specifications.
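To make the timer-driven state machine behavior concrete, here is a minimal sketch of a DCH → FACH → IDLE radio resource model with inactivity timers, and a fast-dormancy option that demotes the radio straight to IDLE after each burst. The state names follow the 3GPP RRC states mentioned in the literature, but all timer values, power levels, and the per-burst energy are hypothetical placeholders, not the operational settings characterized in the paper:

```python
# Minimal sketch of a UMTS RRC state machine (DCH -> FACH -> IDLE) driven by
# inactivity timers. All numbers are hypothetical placeholders.

DCH_POWER_MW = 800.0    # assumed power on the dedicated channel (CELL_DCH)
FACH_POWER_MW = 460.0   # assumed power on the shared channel (CELL_FACH)
T1_S = 5.0              # assumed DCH -> FACH inactivity timer
T2_S = 12.0             # assumed FACH -> IDLE inactivity timer
BURST_ENERGY_MJ = 50.0  # assumed energy per packet burst (incl. promotion)

def radio_energy_mj(burst_times, fast_dormancy=False):
    """Radio energy for a sorted list of packet-burst timestamps (seconds).

    Every burst promotes the radio to DCH. Without fast dormancy, the radio
    then waits out the DCH tail (T1) and FACH tail (T2) before reaching IDLE.
    With fast dormancy, the device requests demotion to IDLE right after each
    burst, trading tail energy for extra signaling (not modeled here).
    """
    energy = 0.0
    for t, nxt in zip(burst_times, burst_times[1:] + [None]):
        energy += BURST_ENERGY_MJ
        if fast_dormancy:
            continue  # assume immediate demotion: no tail energy
        gap = float("inf") if nxt is None else nxt - t
        dch_tail = min(gap, T1_S)                     # time idling on DCH
        fach_tail = min(max(gap - T1_S, 0.0), T2_S)   # time idling on FACH
        energy += dch_tail * DCH_POWER_MW + fach_tail * FACH_POWER_MW
    return energy
```

For a periodic-burst workload such as chunked streaming, the tails between bursts dominate under these placeholder numbers, so early demotion cuts radio energy sharply; this is the qualitative mechanism behind the fast-dormancy saving reported above, though the actual 80% figure comes from the paper's trace-driven evaluation, not this sketch.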