Abstract-A method to perform convolutive blind source separation of super-Gaussian sources by minimizing the mutual information between segments of the output signals is presented. The proposed approach is essentially an implementation of an idea previously proposed by Pham. The formulation of mutual information in the proposed criterion makes use of a nonparametric estimator of Renyi's α-entropy, which becomes Shannon's entropy in the limit as α approaches 1. Since α can be any number greater than 0, this produces a family of criteria having an infinite number of members. Interestingly, it appears that Shannon's entropy cannot be used for convolutive source separation with this type of estimator. In fact, only one value of α appears to be appropriate, namely α = 2, which corresponds to Renyi's quadratic entropy. Four experiments are included to show the efficacy of the proposed criterion.

Index Terms-Convolutive blind source separation (BSS), information theoretic learning, nonparametric entropy estimator, Renyi's entropy.
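For concreteness, the following is a minimal sketch (not the authors' implementation) of the kind of nonparametric estimator the abstract refers to: a Parzen-window estimate of Renyi's quadratic entropy (α = 2) for a one-dimensional signal, written in Python with NumPy. The function name renyi_quadratic_entropy, the kernel width sigma, and the Laplacian-versus-Gaussian comparison at the end are illustrative assumptions rather than details taken from the paper.

import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    With a Gaussian kernel of width sigma, the double sum over sample
    pairs is the 'information potential' V(X), and H2 = -log V(X).
    """
    x = np.asarray(x, dtype=float).reshape(-1)
    n = x.size
    # Pairwise differences between samples.
    d = x[:, None] - x[None, :]
    # Convolving two Gaussian kernels of width sigma gives width sigma*sqrt(2),
    # i.e. variance 2*sigma**2.
    s2 = 2.0 * sigma**2
    kernel = np.exp(-d**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = kernel.sum() / n**2
    return -np.log(information_potential)

# Illustrative use: compare the estimate for a unit-variance Laplacian
# (super-Gaussian) source with that for a Gaussian source of equal variance.
rng = np.random.default_rng(0)
lap = rng.laplace(scale=1.0 / np.sqrt(2.0), size=2000)  # variance 1
gau = rng.normal(scale=1.0, size=2000)
print(renyi_quadratic_entropy(lap, sigma=0.5))
print(renyi_quadratic_entropy(gau, sigma=0.5))

Because the quadratic-entropy estimate reduces to a sum of pairwise kernel evaluations, it avoids the logarithm inside the expectation that makes the α → 1 (Shannon) case awkward with this type of estimator, which is one reason α = 2 is singled out in the abstract.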