Abstract: We present PIE, a scalable routing scheme that achieves 100% packet delivery and low path stretch. It is easy to implement in a distributed fashion and works well when costs are associated with links. Scalability is achieved by using virtual coordinates in a space of concise dimensionality, which enables greedy routing based only on local knowledge. PIE is a general routing scheme, meaning that it works on any graph. We focus, however, on the Internet, where routing scalability is an urgent concern. We show analytically and through simulation that the scheme scales extremely well on Internet-like graphs. In addition, its geometric nature allows it to react efficiently to topological changes or failures by finding new paths in the network at no cost, yielding better delivery ratios than standard algorithms. The proposed routing scheme needs an amount of memory polylogarithmic in the size of the network and requires only local communication between nodes. Although each node constructs its coordinates and routes packets locally, the path stretch remains extremely low, even lower than for centralized or less scalable state-of-the-art algorithms: PIE always finds short paths and often finds the shortest ones.
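The abstract does not spell out PIE's coordinate construction or distance metric, but the core idea of greedy routing on virtual coordinates can be illustrated with a minimal sketch. The snippet below assumes a plain Euclidean distance and a toy four-node topology; the node names, coordinates, and the greedy_next_hop helper are purely illustrative, not PIE's actual hierarchical coordinates or metric.

import math

def greedy_next_hop(current, destination, neighbors, coords):
    """Pick the neighbor whose virtual coordinates are closest to the
    destination's coordinates; return None if no neighbor improves on
    the current node (a local minimum)."""
    def dist(a, b):
        # Euclidean distance in the virtual coordinate space (illustrative;
        # PIE defines its own metric over its own coordinates).
        return math.dist(coords[a], coords[b])

    best, best_d = None, dist(current, destination)
    for n in neighbors[current]:
        d = dist(n, destination)
        if d < best_d:
            best, best_d = n, d
    return best

# Toy topology: coordinates are 2-D points, links are bidirectional.
coords = {"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (2, 1)}
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}

hop, path = "A", ["A"]
while hop != "D":
    hop = greedy_next_hop(hop, "D", neighbors, coords)
    path.append(hop)
print(path)  # ['A', 'B', 'C', 'D']

With only its neighbors' coordinates and the destination's coordinates, each node forwards packets using purely local knowledge; handling the local minima where no neighbor improves on the current distance is where a full scheme such as PIE goes beyond this plain greedy forwarding sketch.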
Information centric networks (ICNs) allow content objects to be cached within the network so as to provide efficient data delivery. Existing work on in-network caching mainly focuses on minimizing cache redundancy to improve the cache hit ratio, which may not lead to significant bandwidth savings. It can also result in overly frequent caching operations, i.e., cache placements and replacements, which increase power consumption at nodes and should be avoided in energy-limited data delivery environments such as wireless networks. In this paper, we propose a distributed caching strategy along the data delivery path, called MAGIC (MAx-Gain In-network Caching). MAGIC aims to reduce bandwidth consumption by jointly considering content popularity and hop reduction. We also take the cache replacement penalty into account when making cache placement decisions, so as to reduce the number of caching operations. We compare our caching strategy with several state-of-the-art caching strategies in ICNs. Our results show that, when the cache size is small, MAGIC reduces bandwidth consumption by up to 34.50%, lowers the server hit ratio by up to 17.91%, and reduces caching operations by up to 38.84% compared with the best existing caching strategy, which is a significant improvement for wireless networks with limited cache size at each node.
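The abstract describes MAGIC's placement rule only at a high level: a gain that combines content popularity, hop reduction, and a replacement penalty. The sketch below assumes a simple linear gain of the form popularity × hops saved minus eviction cost and caches at the single node on the delivery path with maximum gain; the function names, the gain formula, and the toy path are assumptions for illustration, not the paper's exact definitions.

def cache_gain(popularity, hops_saved, replacement_penalty=0.0):
    """Illustrative 'gain' of caching a content object at a node:
    expected bandwidth saved (request rate times hops avoided) minus a
    penalty for evicting whatever currently occupies the cache slot.
    The exact formula used by MAGIC may differ."""
    return popularity * hops_saved - replacement_penalty

def place_on_path(path_nodes, popularity):
    """Choose the single node on the delivery path with maximum gain.
    `path_nodes` is ordered from server to requester; a node i hops
    away from the server saves i hops for future requests."""
    best_node, best_gain = None, float("-inf")
    for hops_saved, node in enumerate(path_nodes):
        penalty = node.get("eviction_cost", 0.0)
        g = cache_gain(popularity, hops_saved, penalty)
        if g > best_gain:
            best_node, best_gain = node, g
    return best_node, best_gain

# Toy delivery path: server -> r1 -> r2 -> requester-side edge node.
path = [{"name": "server"},
        {"name": "r1", "eviction_cost": 0.5},
        {"name": "r2", "eviction_cost": 2.0},
        {"name": "edge", "eviction_cost": 0.1}]
node, gain = place_on_path(path, popularity=1.5)
print(node["name"], gain)  # edge 4.4

Caching at one max-gain node along the path, rather than at every node it traverses, is what keeps the number of cache placements and replacements (and hence the associated power consumption) low in this kind of scheme.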