2004
DOI: 10.1109/tbme.2004.824133

Characterization of Event Related Potentials Using Information Theoretic Distance Measures

Abstract: Analysis of event-related potentials (ERPs) using signal processing tools has become extremely widespread in recent years. Nonstationary signal processing tools such as wavelets and time-frequency distributions have proven to be especially effective in characterizing the transient phenomena encountered in event-related potentials. In this paper, we focus on the analysis of event-related potentials collected during a psychological experiment where two groups of subjects, spider phobics and snake phobics, are sh…

Cited by 22 publications (24 citation statements); references 21 publications. Citing publications span 2007–2022.
“…The PSD_n again represents the probability density function, analogously to previous EEG studies. 2,24 Hence, the Rényi entropy can be considered as an alternative way to estimate the irregularity of time-frequency distributions. 3 This entropic form is parameterized by an entropic index q ∈ ℝ, such that the Rényi entropy reduces to the Boltzmann-Gibbs entropy in the limit q → 1.…”
Section: Definition of Spectral Entropies (mentioning)
confidence: 99%
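For orientation, here is the standard order-q Rényi entropy of a discrete density p = (p_1, …, p_N) and its Shannon (Boltzmann-Gibbs) limit as q → 1; this is a sketch of the textbook definition, not necessarily the notation used in the citing paper:

```latex
H_q(p) \;=\; \frac{1}{1-q}\,\log_2\!\Big(\sum_{n=1}^{N} p_n^{\,q}\Big),
\qquad q \in \mathbb{R},\; q \neq 1,
\qquad
\lim_{q \to 1} H_q(p) \;=\; -\sum_{n=1}^{N} p_n \log_2 p_n .
```

The limit follows by applying l'Hôpital's rule in q, which is why the Rényi family is said to contain the Boltzmann-Gibbs (Shannon) entropy as the q → 1 special case.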
“…Previous studies have successfully applied several quantifiers based on extensive (e.g., Shannon and Rényi entropies) and non-extensive (e.g., Tsallis entropy) information measures to analyze EEG signals. 2,5,9,24,33,40,41 While extensive quantifiers, such as the Shannon and Rényi entropies, have proved successful in describing systems with short-range interactions, non-extensive measures, such as the Tsallis entropy, are able to describe systems where the effective interactions are long-range. 5,40 In addition, MEG recordings, like EEG signals, are non-stationary and their characteristics may change over time.…”
Section: Introduction (mentioning)
confidence: 99%
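To make the distinction between extensive and non-extensive quantifiers concrete, here is a minimal Python sketch of the Shannon, Rényi, and Tsallis entropies applied to a normalized power spectral density; the function names, the synthetic white-noise epoch, and the parameter choices are illustrative assumptions, not code from the cited studies.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a normalized distribution p; an extensive measure."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, q):
    """Rényi entropy of order q != 1 (bits); tends to the Shannon entropy as q -> 1."""
    p = p[p > 0]
    return np.log2(np.sum(p ** q)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis (non-extensive) entropy of order q != 1."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Treat a normalized power spectral density as a probability density (PSD_n),
# as in the quoted passage; the white-noise epoch below is only a stand-in.
rng = np.random.default_rng(0)
epoch = rng.standard_normal(1024)
psd = np.abs(np.fft.rfft(epoch)) ** 2
psd_n = psd / psd.sum()
print(shannon_entropy(psd_n), renyi_entropy(psd_n, q=3), tsallis_entropy(psd_n, q=2))
```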
“…The entropy of the resulting TFD surface is computed using the Rényi entropy of order 3. The choice of the order of the Rényi entropy has been previously discussed in detail in [13]. The entropy values are normalized by log₂(L²) = log₂(2500).…”
Section: B. Complexity Analysis (mentioning)
confidence: 99%
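A minimal sketch of that computation, assuming a 50 × 50 time-frequency grid (so L² = 2500) and using a random surface as a stand-in for the actual TFD of an ERP; the helper name renyi_entropy_tfd is hypothetical.

```python
import numpy as np

def renyi_entropy_tfd(tfd, alpha=3):
    """Order-alpha Rényi entropy (bits) of a time-frequency distribution,
    treated as a 2-D probability density after normalizing it to sum to 1."""
    p = np.abs(tfd).ravel()
    p = p / p.sum()
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Assumption: an L x L time-frequency grid with L = 50, so the flat
# (maximum-entropy) surface has entropy log2(L^2) = log2(2500) bits,
# i.e. the normalization constant quoted above. The random surface is
# only a stand-in for a real TFD of an ERP epoch.
L = 50
tfd = np.random.default_rng(1).random((L, L))
H = renyi_entropy_tfd(tfd, alpha=3)
H_normalized = H / np.log2(L ** 2)   # falls in [0, 1]; 1 means maximally spread
print(H, H_normalized)
```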
“…We use recent results in information-theoretic analysis of time-frequency distributions [12], [13] to define entropy and divergence functions based on this joint density function. Second, we extend the current work by considering the interactions between different electrodes as well as the complexity of the individual signals.…”
Section: Introduction (mentioning)
confidence: 99%
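As one illustration of a divergence built from TFDs treated as joint densities, here is a sketch of a Jensen-type Rényi divergence between two normalized surfaces; this particular construction and the helper names are assumptions chosen for illustration, not necessarily the distance measures defined in [12], [13].

```python
import numpy as np

def normalize(tfd):
    """Turn a nonnegative time-frequency surface into a 2-D probability density."""
    p = np.abs(tfd)
    return p / p.sum()

def renyi_entropy(p, alpha=3):
    """Order-alpha Rényi entropy (bits) of a normalized density."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(tfd_a, tfd_b, alpha=3):
    """Jensen-type divergence: entropy of the midpoint density minus the mean
    of the individual entropies. Symmetric in its two arguments."""
    p, q = normalize(tfd_a), normalize(tfd_b)
    m = 0.5 * (p + q)
    return renyi_entropy(m, alpha) - 0.5 * (renyi_entropy(p, alpha) + renyi_entropy(q, alpha))

# Stand-in TFDs for two electrodes (or two experimental conditions).
rng = np.random.default_rng(2)
tfd_1, tfd_2 = rng.random((50, 50)), rng.random((50, 50))
print(jensen_renyi_divergence(tfd_1, tfd_2, alpha=3))
```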