We study Bregman divergences in the probability density space embedded with the $L^2$-Wasserstein metric. Several properties and dualities of transport Bregman divergences are provided. In particular, we derive the transport Kullback-Leibler (KL) divergence as the Bregman divergence of the negative Boltzmann-Shannon entropy in $L^2$-Wasserstein space. We also derive analytical formulas and generalizations of the transport KL divergence for one-dimensional probability densities and Gaussian families.

Recently, optimal transport, a.k.a. the Wasserstein distance, has introduced another type of distance function on probability density space. It uses pushforward mapping functions to measure differences between probability densities [32]. A particular example is the $L^2$-Wasserstein distance, which forms an analog of the $L^2$ distance between mapping functions. It also introduces a metric space for probability densities, namely the $L^2$-Wasserstein space [3,30]. In this space, the $L^2$-Wasserstein distance exhibits a particular convexity property with respect to mapping functions [3,24]. This convexity property nowadays has vast applications in fluid dynamics [11,16,17], inverse problems [35], and AI inference problems [2,13,31]. Natural questions arise: What are Bregman divergences in $L^2$-Wasserstein space? In particular, what is the "KL divergence" in $L^2$-Wasserstein space?
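For orientation, we recall the classical definitions that the questions above refer to; these are the standard (Euclidean) formulations, stated here for context rather than as the transport constructions developed in this paper. Given a strictly convex functional $\phi$, the Bregman divergence, its specialization to the KL divergence via the negative Boltzmann-Shannon entropy, and the Monge formulation of the $L^2$-Wasserstein distance read
\begin{align*}
  % Bregman divergence generated by a strictly convex functional \phi
  D_\phi(p \,\|\, q) &= \phi(p) - \phi(q) - \langle \nabla \phi(q),\, p - q \rangle, \\
  % With \phi(p) = \int p \log p \, dx (negative Boltzmann--Shannon entropy),
  % the Bregman divergence recovers the classical KL divergence:
  D_{\mathrm{KL}}(p \,\|\, q) &= \int p(x) \log \frac{p(x)}{q(x)} \, dx, \\
  % Monge formulation: infimum over pushforward maps T transporting p to q
  W_2(p, q)^2 &= \inf_{T \colon T_{\#} p = q} \int \| T(x) - x \|^2 \, p(x) \, dx.
\end{align*}
The paper's transport Bregman divergences replace the Euclidean (linear-space) structure underlying the first identity with the $L^2$-Wasserstein geometry on mapping functions.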