We propose an entropy-based information measure, the Discounted Least Information Theory of Entropy (DLITE), which not only exhibits important characteristics expected of an information measure but also satisfies the conditions of a metric. Classic information measures such as Shannon Entropy, KL Divergence, and Jensen-Shannon Divergence exhibit some of these properties but lack others. This work fills an important gap in the advancement of information theory and its applications, where such properties are desirable.