“…By expanding on the classical concept of mutual information (Shannon 1948), which quantifies the shared information between two random variables, Schreiber (2000) introduced transfer entropy as a measure of the asymmetry in the interaction between two coupled stochastic processes. Since its inception, transfer entropy has emerged as the prevalent choice for studying pairwise, or dyadic, interactions in a wide range of complex systems, for example, in quantifying directional connectivity and inferring network topology in brain function (Staniek and Lehnertz 2008, Vicente et al 2011, Stetter et al 2012), identifying leadership behavior in groups and pairwise interactions between animals (Butail et al 2016, Lord et al 2016, Neri et al 2017, Shaffer and Abaid 2020, Valentini et al 2021), studying complex connections in climate science (Hlinka et al 2013, Campuzano et al 2018), inferring causality in stocks and finance (Sandoval Jr 2014, He and Shang 2017), and understanding causal influences in social media and human behavior (Borge-Holthoefer et al 2016, Porfiri et al 2019).…”
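To make the quoted definition concrete, the following is a minimal sketch (not code from the source) of a plug-in estimator of Schreiber's transfer entropy for discrete time series with history length 1; the function name `transfer_entropy` and the synthetic coupled series are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T_{Y->X} in bits, for
    discrete series with history length 1 (Schreiber 2000):
      T_{Y->X} = sum_{x',x,y} p(x', x, y) * log2[ p(x'|x,y) / p(x'|x) ]
    where x' = x_{t+1}, x = x_t, y = y_t.
    """
    triples = list(zip(x[1:], x[:-1], y[:-1]))
    n = len(triples)
    p_abc = Counter(triples)                       # counts of (x_{t+1}, x_t, y_t)
    p_ab = Counter((a, b) for a, b, _ in triples)  # counts of (x_{t+1}, x_t)
    p_bc = Counter((b, c) for _, b, c in triples)  # counts of (x_t, y_t)
    p_b = Counter(b for _, b, _ in triples)        # counts of x_t
    te = 0.0
    for (a, b, c), cnt in p_abc.items():
        # p(a|b,c) / p(a|b) = [p(a,b,c) * p(b)] / [p(b,c) * p(a,b)]
        te += (cnt / n) * np.log2(cnt * p_b[b] / (p_bc[(b, c)] * p_ab[(a, b)]))
    return te

# Illustrative directed coupling: x copies y with one step of lag,
# while y is i.i.d. binary noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 10_000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]

te_y_to_x = transfer_entropy(x, y)  # close to 1 bit: y determines x's next state
te_x_to_y = transfer_entropy(y, x)  # close to 0: y evolves independently
```

The asymmetry between the two estimates is exactly the directional character the passage attributes to transfer entropy: unlike mutual information, it distinguishes the influence of Y on X from that of X on Y.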