Introduction

Several measures have been introduced in the literature on information theory and statistics as measures of information. The most commonly used in information theory is Shannon's entropy [1]. It gives the amount of uncertainty concerning the outcome of an experiment. Kullback and Leibler [2] introduced a measure associated with two distributions of an experiment; it expresses the amount of information supplied by the data for discriminating between the distributions. As a symmetric measure, the Jeffreys-Kullback-Leibler J-divergence is commonly used. Furthermore, the measure arising from the concavity of Shannon's entropy, known as the information radius [3], is also gaining importance in applications. There exists an elegant inequality [4,5] between the information radius and the J-divergence. Burbea and Rao [4,6] called the information radius the Jensen difference divergence measure and also presented a one-scalar-parameter generalization of it. Some properties and applications of the Jensen difference divergence measure and its generalizations can be found in [4,6-10]. Recently, Taneja [11] introduced different ways to generalize the Jensen difference divergence measure with two scalar parameters, while two-scalar-parameter generalizations of the J-divergence can be found in Taneja [11-14].

On the other hand, the most famous measure in the statistical literature is Fisher's [15] measure of information. It measures the amount of information supplied by the data about an unknown parameter θ. Rao [16] introduced a Riemannian metric on a model of probability distributions by using the Fisher information matrix.
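For concreteness, the standard forms of the measures mentioned above may be recalled. The following is a sketch in common textbook notation, for probability distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n) and a density f(x; θ); this notation is assumed for illustration and need not coincide with the notation adopted later in the paper.

% Standard forms (illustrative notation; may differ from the paper's own):
\begin{align*}
  H(P)      &= -\sum_{i=1}^{n} p_i \log p_i
             && \text{(Shannon entropy)}\\
  D(P\|Q)   &= \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
             && \text{(Kullback--Leibler divergence)}\\
  J(P,Q)    &= D(P\|Q) + D(Q\|P)
             && \text{(Jeffreys--Kullback--Leibler $J$-divergence)}\\
  R(P,Q)    &= H\!\left(\tfrac{P+Q}{2}\right) - \tfrac{1}{2}\bigl(H(P) + H(Q)\bigr)
             && \text{(information radius / Jensen difference)}\\
  I(\theta) &= \mathrm{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right]
             && \text{(Fisher information)}
\end{align*}

In these terms, the inequality between the information radius and the J-divergence referred to above is commonly cited as R(P,Q) ≤ (1/4) J(P,Q), which follows from the convexity of the Kullback-Leibler divergence in its second argument; the precise statement in [4,5] may be given in a different but equivalent form.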