In this paper we review a particular connection between information theory and group theory. We formalize the notions of information elements and information lattices, first proposed by Shannon. Exploiting this formalization, we expose a comprehensive parallelism between information lattices and subgroup lattices. Qualitatively, we demonstrate isomorphisms between information lattices and subgroup lattices. Quantitatively, we highlight a decisive approximation relation between the entropy structure of an information lattice and the log-index structure of the corresponding subgroup lattice. This approximation, addressing both joint and common entropies, extends the work of Chan and Yeung, who first discovered the approximation for joint entropy. A consequence of this approximation result is that any continuous law holds in general for the entropies of information elements if and only if the same law holds in general for the log-indices of subgroups. As an application, by constructing subgroup counterexamples, we find, surprisingly, that common information, unlike joint information, obeys neither the submodularity nor the supermodularity law. We emphasize that the notion of information elements is conceptually significant: formalizing it helps to reveal the deep connection between information theory and group theory. The parallelism established in this paper admits an appealing group-action explanation and provides useful insights into the intrinsic structure of information elements from a group-theoretic perspective.
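To fix ideas, the following is a minimal sketch of the approximation in symbols; the notation, a finite group $G$ with subgroups $G_1, \dots, G_n$ representing information elements $x_1, \dots, x_n$ together with an integer blow-up factor $N$, is assumed here for illustration and is developed formally in the body of the paper. Writing $\vee$ for the joint and $\wedge$ for the common information element, the approximation takes the form
\[
H\Bigl(\bigvee_{i \in \alpha} x_i\Bigr) \;\approx\; \frac{1}{N}\,\log\Bigl[\,G : \bigcap_{i \in \alpha} G_i\,\Bigr],
\qquad
H\Bigl(\bigwedge_{i \in \alpha} x_i\Bigr) \;\approx\; \frac{1}{N}\,\log\bigl[\,G : \langle G_i : i \in \alpha \rangle\,\bigr],
\]
for every index set $\alpha \subseteq \{1, \dots, n\}$, where $\bigcap$ denotes subgroup intersection and $\langle \cdot \rangle$ the subgroup generated by the union. Under the inclusion order on subgroups, joins in the information lattice thus correspond to intersections of subgroups and meets to generated subgroups, which is the structural fact underlying the continuous-law equivalence stated above.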