“…The mutual information is related to the marginal entropies and the joint entropy of these random variables through (Cover and Thomas, 1991) Minimization of output mutual information is "the canonical contrast for source separation," as Cardoso states (Cardoso and Souloumiac, 1993). Many researchers agree with this view (Yang and Amari, 1997; Hyvarinen, 1999a; Almeida, 2000). However, three of the best-known methods for ICA, namely JADE (Cardoso and Souloumiac, 1993), Infomax (Bell and Sejnowski, 1995), and FastICA (Hyvarinen, 1999b), use the diagonalization of cumulant matrices, maximization of output entropy, and fourth-order cumulants, respectively.…”
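The equation elided from the excerpt is presumably the standard identity relating mutual information to entropies (Cover and Thomas, 1991); for outputs $y_1,\dots,y_n$ it reads, as a hedged reconstruction:

```latex
% Mutual information of the separator outputs expressed via
% marginal entropies H(y_i) and the joint entropy H(y_1,...,y_n).
I(y_1,\dots,y_n) \;=\; \sum_{i=1}^{n} H(y_i) \;-\; H(y_1,\dots,y_n)
```

Under this identity, $I \ge 0$ with equality iff the outputs are statistically independent, which is why minimizing output mutual information serves as a contrast function for source separation.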