It is well known that the Kullback-Leibler divergence, which extends the concept of Shannon entropy, plays a fundamental role in Information Theory and Machine Learning. Given an a priori probability kernel $\hat{\nu}$ and a probability $\pi$ on the measurable space $X \times Y$, we consider an appropriate definition of the entropy of $\pi$ relative to $\hat{\nu}$, based on previous works. Using this concept of entropy, we obtain a natural definition of information gain for general measurable spaces, which coincides with the mutual information given by the K-L divergence in the case where $\hat{\nu}$ is identified with a probability $\nu$ on $X$. This is used to extend the meaning of specific information gain and dynamical entropy production to the model of thermodynamic formalism for symbolic dynamics over a compact alphabet (the TFCA model). Via the concepts of involution kernel and dual potential, one can ask whether a given potential is symmetric; the relevant information is available in the potential itself. In the affirmative case, its corresponding equilibrium state has zero entropy production.
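For orientation, we recall the classical objects the abstract refers to; these are the standard definitions, stated here only as a reminder, while the entropy relative to a general kernel $\hat{\nu}$ considered in the paper generalizes them. For probabilities $P \ll Q$ on the same measurable space, the Kullback-Leibler divergence is
$$ D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \int \log\!\Big(\frac{dP}{dQ}\Big)\, dP, $$
and, for a probability $\pi$ on $X \times Y$ with marginals $\pi_X$ and $\pi_Y$, the mutual information is the K-L divergence of $\pi$ relative to the product of its marginals,
$$ I(\pi) \;=\; D_{\mathrm{KL}}\big(\pi \,\|\, \pi_X \otimes \pi_Y\big). $$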