We study and compare modeling an end-to-end network by conventional, bivariate, and exponential observation hidden Markov processes. Furthermore, the effects of μ-law, Linde-Buzo-Gray, and uniform quantization approaches on the modeling granularity are explored. We performed experiments using synthetic representative data from a traffic-modeler autoregressive modular process and the Network Simulator software, as well as over-the-Internet experiments with real data, to contrast the fidelity produced by each model. Comparing statistical signatures of the model-generated data with those of the training sequence indicates that pairing Linde-Buzo-Gray quantization with conventional or bivariate hidden Markov processes significantly improves the modeling fidelity.
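To make the quantization comparison concrete, the following is a minimal Python sketch, not drawn from the paper's experiments, that contrasts uniform quantization with a Linde-Buzo-Gray (Lloyd-style) codebook on synthetic long-tailed delay samples; the log-normal delay model, the codebook size of eight, and all identifiers are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
delays = rng.lognormal(mean=3.0, sigma=1.0, size=10_000)  # assumed long-tailed delay samples (ms)

K = 8  # assumed number of quantization levels

# Uniform quantization: K equal-width bins across the observed range.
edges = np.linspace(delays.min(), delays.max(), K + 1)
uniform_codebook = 0.5 * (edges[:-1] + edges[1:])
uniform_idx = np.clip(np.digitize(delays, edges[1:-1]), 0, K - 1)
uniform_mse = np.mean((delays - uniform_codebook[uniform_idx]) ** 2)

# Linde-Buzo-Gray / Lloyd iterations: alternate nearest-codeword assignment
# and centroid update until the codebook stops moving.
codebook = np.quantile(delays, (np.arange(K) + 0.5) / K)  # quantile-based initial codewords
for _ in range(100):
    idx = np.argmin(np.abs(delays[:, None] - codebook[None, :]), axis=1)
    updated = np.array([delays[idx == k].mean() if np.any(idx == k) else codebook[k]
                        for k in range(K)])
    if np.allclose(updated, codebook):
        break
    codebook = updated
idx = np.argmin(np.abs(delays[:, None] - codebook[None, :]), axis=1)
lbg_mse = np.mean((delays - codebook[idx]) ** 2)

print(f"uniform quantization MSE: {uniform_mse:.2f}")
print(f"LBG quantization MSE:     {lbg_mse:.2f}")
# The LBG codebook concentrates levels in the dense low-delay region, which is
# why pairing it with CHMPs/BHMPs helps for long-tailed delay distributions.

Under these assumptions, the LBG codebook typically yields markedly lower distortion on the long-tailed samples, which mirrors the fidelity gains summarized above.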
I. INTRODUCTION

Network simulation for the purpose of application performance evaluation, using event-driven simulators such as Network Simulator (NS2) [1], Dummynet [2], and NIST Net [3], requires configuring such a multitude of components that large-scale networks become impractical to simulate. This has inspired the development of simple, generic network models, regardless of network size and complexity, by means of hidden Markov processes (HMPs) [4]. Salamatian et al. [5] use conventional HMPs (CHMPs) [6] for packet loss modeling. Weiwei et al. [4] deploy bivariate HMPs (BHMPs) for delay-based network modeling, since end-to-end delay drastically affects the performance of applications running on the network. Designing network models based on such delays is therefore informative about application performance, as the HMP-generated observations represent the actual delays that running applications will undergo. Since the HMP parameters are the only means of generating these delays, selecting an HMP variation and inferring its parameters precisely are of high consequence.

In this paper, we contrast the effectiveness of CHMPs, BHMPs, and exponential observation HMPs (EHMPs) [6] for delay-based network modeling. Because CHMPs and BHMPs require discrete training data, quantization becomes part of the modeling procedure. Since the quantized data are the only means of obtaining CHMPs and BHMPs, the quantization quality dramatically affects the modeling fidelity. Weiwei [25] simply deployed uniformly quantized data; however, the long-tailed delay distributions observed in networks [7] lead to a poor discretization under uniform quantization. Such a choice adversely