2020
DOI: 10.1088/1741-2552/ab6040

Decoding of single-trial EEG reveals unique states of functional brain connectivity that drive rapid speech categorization decisions

Abstract: Categorical perception (CP) is an inherent property of speech perception. The response time (RT) of listeners' perceptual speech identification is highly sensitive to individual differences. While the neural correlates of CP have been well studied in terms of the regional contributions of the brain to behavior, the functional connectivity patterns that signify individual differences in listeners' speed (RT) for speech categorization are less clear. To address these questions, we applied several computational approa…

Cited by 23 publications (28 citation statements) | References 106 publications (163 reference statements)
“…Similar results were obtained in a multivariate pattern decoding analysis of Luthra et al (in press), who showed left parietal (SMG) and right temporal (MTG) regions were among the most informative for describing moment-to-moment variability in categorization. Additionally, the link between MTG and PCG implied in our data points to a pathway between the neural substrates that map sounds to meaning and sensorimotor regions that execute motor commands (Al-Fahad et al, 2020; Du et al, 2014). Still, the early time course of these neural effects (~250 ms) occurs well before listeners' behavioral RTs (cf.…”
Section: Discussion
confidence: 56%
“…Parietal engagement is especially prominent when speech items are more perceptually confusable (Feng et al, 2018) or require added lexical readout as in Ganong paradigms (Oberfeld and Klöckner-Nowotny, 2016), and may serve as the sensory-motor interface for speech (Hickok et al, 2009; Hickok and Poeppel, 2000). Moreover, using machine learning to decode full-brain EEG, we have recently shown that left SMG and related outputs from parietal cortex are among the most salient brain areas that code for category decisions (Al-Fahad et al, 2020; Mahmud et al, 2020).…”
Section: Discussion
confidence: 99%
“…Previous computational studies have found that ERPs averaged over 100 trials provided the best classification of data while maintaining reasonable signal SNR and computational efficiency (Al-Fahad et al, 2020; Mahmud et al, 2020). We quantified source-level ERPs with a mean bootstrapping approach (James et al, 2013) by randomly averaging over 100 trials (with replacement) several times (Al-Fahad et al, 2020) for each stimulus condition per participant.…”
Section: Feature Extraction
confidence: 99%
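As a reader aid (not part of the indexed article), the bootstrapped trial-averaging described in the excerpt above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the function name `bootstrap_erp`, the array shapes, and the number of repetitions (`n_bootstraps=50`) are placeholders, since the quoted text specifies averaging 100 trials with replacement per condition but not how many averaged pseudo-trials were produced.

```python
import numpy as np

def bootstrap_erp(trials, n_per_average=100, n_bootstraps=50, seed=None):
    """Bootstrap-averaged ERPs (illustrative sketch, not the authors' code).

    trials : ndarray, shape (n_trials, n_channels, n_times)
        Single-trial source-level responses for one stimulus condition.
    Returns
    -------
    ndarray, shape (n_bootstraps, n_channels, n_times)
        Each entry is the mean of `n_per_average` trials drawn with replacement.
    """
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]
    erps = np.empty((n_bootstraps,) + trials.shape[1:])
    for b in range(n_bootstraps):
        idx = rng.integers(0, n_trials, size=n_per_average)  # sample trial indices with replacement
        erps[b] = trials[idx].mean(axis=0)                   # average the drawn trials into one pseudo-trial
    return erps

# Hypothetical usage: 300 trials x 64 channels x 500 time samples for one condition
single_trials = np.random.randn(300, 64, 500)
erps = bootstrap_erp(single_trials)   # -> (50, 64, 500) averaged pseudo-trials
```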
“…For example, Bidelman et al demonstrated that brain responses in the time frame of 180-320 ms were more robust for phonetic prototypes vs. ambiguous speech tokens, thereby reflecting category-level processing (Bidelman et al, 2020a). Other studies have shown links between N1-P2 amplitudes of the auditory cortical ERPs and the strength of listeners' speech identification (Bidelman & Walker, 2017a) and labeling speeds (Al-Fahad et al, 2020) in speech categorization tasks (Bidelman et al, 2014; Bidelman & Alain, 2015). These findings are consistent with the notion that the early N1 and P2 waves of the ERPs are highly sensitive to speech processing and auditory object formation that is necessary to map sounds to meaning (Alain, 2007; Bidelman et al, 2013b; Wood et al, 1971).…”
Section: Introduction
confidence: 99%
“…However, we have recently shown that speech categories (those carrying a strong phonetic identity) are more resilient to noise degradation than their phonetically ambiguous counterparts (Bidelman et al, 2020a). Categorization recruits a wide variety of frontal, temporal, and parietal brain regions (Al-Fahad et al, 2020; Chang et al, 2010; Myers et al, 2009). Yet, category-level representations are highly prominent in inferior frontal gyrus (IFG) (Bidelman and Walker, 2019; Myers et al, 2009) and auditory cortex (AC) (Bidelman and Lee, 2015; Bidelman and Walker, 2019; Chang et al, 2010), suggesting frontotemporal interplay is an important driver of sound labeling.…”
Section: Introduction
confidence: 99%