The Kullback-Leibler divergence (KLD), or relative entropy, is developed analytically to measure the discrepancy between probability density functions (PDFs), specifically between the PDFs of the generalized Gaussian distributions fitted to seizure and non-seizure signals. See [9-11] for related work on this topic in epilepsy and [12-14] for applications to EEG signals. The remainder of this paper is structured as follows. Section 2 presents the proposed methodology, detailing the generalized Gaussian model and the analytical development of the Kullback-Leibler divergence. Section 3 describes the experiments on real EEG signals and presents the results obtained. Section 4 discusses the findings and provides perspectives for future work.
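As a point of reference only, and not to preempt the development presented in Section 2, the sketch below computes the closed-form KLD between two zero-mean generalized Gaussian densities with scale parameters alpha and shape parameters beta, using the standard expression reported in the literature; the function name and the example parameter values are purely illustrative and are not taken from the paper.

```python
import math

def ggd_kld(alpha1, beta1, alpha2, beta2):
    """Closed-form KLD between two zero-mean generalized Gaussian densities
    p_i(x) = beta_i / (2 * alpha_i * Gamma(1/beta_i)) * exp(-(|x| / alpha_i) ** beta_i),
    following the standard closed-form expression from the literature."""
    g = math.gamma
    return (math.log((beta1 * alpha2 * g(1.0 / beta2)) /
                     (beta2 * alpha1 * g(1.0 / beta1)))
            + (alpha1 / alpha2) ** beta2 * g((beta2 + 1.0) / beta1) / g(1.0 / beta1)
            - 1.0 / beta1)

# Illustrative values for two hypothetical seizure / non-seizure fits
print(ggd_kld(alpha1=1.5, beta1=1.2, alpha2=0.8, beta2=2.0))
```

Note that the expression reduces to zero when both densities share the same scale and shape, as expected of a divergence between identical distributions.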