Performance evaluation of the 802.11 MAC protocol is classically based on the decoupling assumption, which hypothesizes that the backoff processes at different nodes are independent. This decoupling assumption results from mean field convergence and is generally true in the transient regime in the asymptotic sense (when the number of wireless nodes tends to infinity) but, contrary to widespread belief, may not necessarily hold in the stationary regime. The issue is often related to the existence and uniqueness of a solution to a fixed point equation; however, it was recently shown that this condition is not sufficient; rather, a sufficient condition is a global stability property of the associated ordinary differential equation. In this paper, we give a simple condition that establishes the asymptotic validity of the decoupling assumption for the homogeneous case (all nodes have the same parameters). We also discuss the heterogeneous and differentiated service cases and formulate a new ordinary differential equation. We show that uniqueness of a solution to the associated fixed point equation is not sufficient; we exhibit one case where the fixed point equation has a unique solution but the decoupling assumption is not valid in the asymptotic sense in the stationary regime.
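To make the fixed point equation concrete, the following is a minimal sketch in the spirit of the standard Bianchi-style mean field model for homogeneous nodes; the exact equations and parameter names here (attempt_rate, W, K, the damping factor) are illustrative assumptions, not the paper's own formulation. Under the decoupling assumption, the per-node attempt probability tau depends on the collision probability gamma, while gamma = 1 - (1 - tau)^(n-1) for n nodes, and the fixed point is found by damped iteration:

```python
def attempt_rate(gamma, W=32, K=6):
    """Per-slot attempt probability tau(gamma) for binary exponential
    backoff with initial window W and K doubling stages (renewal-reward:
    expected attempts per cycle over expected backoff slots per cycle)."""
    attempts = sum(gamma ** i for i in range(K + 1))
    slots = sum(gamma ** i * (2 ** i * W - 1) / 2 for i in range(K + 1))
    return attempts / slots

def solve_fixed_point(n, iters=200, damping=0.5):
    """Damped iteration of gamma <- 1 - (1 - tau(gamma))**(n - 1)."""
    gamma = 0.5  # initial guess for the collision probability
    for _ in range(iters):
        new = 1.0 - (1.0 - attempt_rate(gamma)) ** (n - 1)
        gamma = damping * gamma + (1 - damping) * new
    return gamma

print(solve_fixed_point(n=10))  # approximate collision probability
```

The paper's point is precisely that a unique solution to such an equation does not by itself validate the decoupling assumption in the stationary regime; global stability of the associated ODE is the sufficient condition.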
This paper discovers fundamental principles of the backoff process that governs the performance of IEEE 802.11. A basic principle, founded upon regular variation theory, is that the backoff time has a truncated Pareto-type tail distribution with an exponent of (log γ)/log m (where m is the multiplicative factor and γ is the collision probability). This reveals that the per-node backoff process is heavy-tailed in the strict sense for γ > 1/m² (the tail exponent falls below 2 precisely in this regime), and paves the way for the following unifying result. The state-of-the-art theory on the superposition of heavy-tailed processes is applied to establish a dichotomy exhibited by the aggregate backoff process, emphasizing the importance of the time-scales on which the backoff processes are viewed. While aggregation on normal time-scales leads to a Poisson process, on coarse time-scales the aggregate is approximated by a new limiting process possessing long-range dependence (LRD). This dichotomy turns out to be instrumental in formulating short-term fairness, extending existing formulas to arbitrary population sizes, and elucidating the absence of LRD in practical situations. A refined wavelet analysis is conducted to strengthen this argument.
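The tail claim can be checked empirically. Below is a minimal Monte Carlo sketch, assuming an idealized per-node backoff (each attempt collides independently with probability γ, the contention window grows by factor m per collision and is truncated at stage K, and each backoff draw is uniform in the current window); the function names and the crude two-quantile Pareto fit are illustrative assumptions, not the paper's method:

```python
import math
import random

def backoff_time(gamma, m=2, W=32, K=10):
    """Total backoff slots accumulated until the first success."""
    total, stage = 0.0, 0
    while True:
        total += random.uniform(0, m ** stage * W)  # draw from current window
        if random.random() > gamma:                 # success: stop
            return total
        stage = min(stage + 1, K)                   # collision: grow window

gamma, m = 0.4, 2
samples = sorted(backoff_time(gamma, m) for _ in range(200_000))
# Crude tail-exponent estimate from two empirical quantiles:
q1 = samples[int(0.90 * len(samples))]
q2 = samples[int(0.99 * len(samples))]
alpha_hat = math.log(0.10 / 0.01) / math.log(q2 / q1)
print(f"empirical tail exponent ~ {alpha_hat:.2f}, "
      f"theory -log(gamma)/log(m) = {-math.log(gamma) / math.log(m):.2f}")
```

With γ = 0.4 and m = 2 the predicted exponent is about 1.32, i.e., below 2, illustrating the heavy-tailed regime γ > 1/m² before the truncation at stage K takes effect.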
We present a network architecture for distributed utility max-min flow control of elastic and non-elastic flows, in which users' utility values (rather than their data rates) are driven to max-min fairness. The proposed link algorithm converges to the utility max-min fair bandwidth allocation in the presence of round-trip delays and without using any information about users' utility functions. To establish global, rather than merely local, stability of the proposed algorithm, we find the use of nonlinear control theory unavoidable. Even though the flow-control algorithm is fully distributed, we show that any utility function can be accommodated as long as its minimum slope exceeds a certain positive value. Though our analysis is limited to the single-bottleneck, homogeneous-delay case, we believe the proposed algorithm is the first to achieve utility max-min fairness with guaranteed stability in a distributed manner.
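For intuition about the target allocation, the following is a minimal centralized sketch of utility max-min fairness on a single bottleneck, assuming strictly increasing utility functions with known inverses; note that the paper's distributed link algorithm explicitly does not require knowledge of the utility functions, so this sketch only illustrates the allocation it converges to, and the example utilities are hypothetical. On a single link, utility max-min fairness means all users attain a common utility level u*, found here by bisection:

```python
import math

def utility_max_min(inverse_utils, capacity, u_hi=10.0, iters=60):
    """Bisect on the common utility level u so that the rates
    x_i = U_i^{-1}(u) fill the bottleneck capacity; assumes total
    demand at u_hi already exceeds capacity."""
    u_lo = 0.0
    for _ in range(iters):
        u = (u_lo + u_hi) / 2.0
        demand = sum(inv(u) for inv in inverse_utils)
        if demand > capacity:
            u_hi = u   # rates too large: lower the utility target
        else:
            u_lo = u   # spare capacity: raise the utility target
    u = (u_lo + u_hi) / 2.0
    return [inv(u) for inv in inverse_utils]

# Hypothetical utilities: U1(x) = log(1 + x), U2(x) = log(1 + x/2).
inv1 = lambda u: math.expm1(u)        # inverse of U1
inv2 = lambda u: 2.0 * math.expm1(u)  # inverse of U2
print(utility_max_min([inv1, inv2], capacity=10.0))
```

Here the steeper user receives a larger rate so that both reach the same utility value, which is exactly the sense in which utility values, rather than data rates, are equalized.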