Identifying the attended speaker using electrocorticographic (ECoG) signals
2015 | DOI: 10.1080/2326263x.2015.1063363

Abstract: People affected by severe neurodegenerative diseases (e.g., late-stage amyotrophic lateral sclerosis (ALS) or locked-in syndrome) eventually lose all muscular control. Thus, they cannot use traditional assistive communication devices that depend on muscle control, or brain-computer interfaces (BCIs) that depend on the ability to control gaze. While auditory and tactile BCIs can provide communication to such individuals, their use typically entails an artificial mapping between the stimulus and the communicati…

Cited by 28 publications (23 citation statements) | References 52 publications
“…Such results have been obtained during a large variety of tasks, such as auditory perception (Crone et al, 2001; Canolty et al, 2007; Edwards et al, 2009; Potes et al, 2012), motor movements (Crone et al, 1998; Miller et al, 2007), visual spatial attention (Gunduz et al, 2011, 2012), auditory attention (Golumbic et al, 2013; Mesgarani and Chang, 2012; Dijkstra et al, 2015) or imagined speech (Pei et al, 2011a,b). Here, we used the broadband gamma response to locate areas that responded to the auditory and motor tasks.…”
Section: Discussion
confidence: 83%
“…These differences in neural representations of attended and unattended speech have subsequently been shown to allow us to decode from brain activity which speaker someone is paying attention to: auditory attention decoding [7][8][9][10][11] . This provides a proof of concept of the neuroscientific phenomenon, but also opens the path to develop neurotechnology that exploits this decoding.…”
Section: Introduction
confidence: 99%
“…Auditory attention decoding approaches take advantage of features of the speech signal that are known to be selectively enhanced for attended over unattended speech. Examples of such speech features are, for instance, the changes in sound intensity over time (the speech envelope) [7][8][9]11 , or the changes in intensity over time across frequency bands (the speech spectrogram) 12,13 . Generally, a (regularized) regression is used to learn a mapping from the subject's electrophysiological data (e.g., EEG) to the chosen speech signal features, based on data for which the attended speech is known.…”
Section: Introduction
confidence: 99%
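The regularized-regression decoding pipeline described in the statement above can be sketched as follows. This is a minimal illustration using NumPy and simulated data: the channel count, signal lengths, noise level, and ridge penalty are all illustrative assumptions, not values or methods from the cited studies.

```python
# Sketch of envelope-based auditory attention decoding with ridge regression.
# All signals here are simulated; real pipelines use recorded EEG/ECoG and
# measured speech envelopes.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 2000, 16

# Simulated slow-varying speech envelopes for two competing speakers.
smooth = np.ones(50) / 50
env_attended = np.convolve(rng.standard_normal(n_samples), smooth, mode="same")
env_unattended = np.convolve(rng.standard_normal(n_samples), smooth, mode="same")

# Simulated neural data: a noisy linear mixture tracking the attended envelope.
mixing = rng.standard_normal(n_channels)
eeg = np.outer(env_attended, mixing) + 0.5 * rng.standard_normal((n_samples, n_channels))

# Train on the first half (attended speaker known), test on the second half.
split = n_samples // 2
X_train, X_test = eeg[:split], eeg[split:]
y_train = env_attended[:split]

# Ridge regression (the "regularized regression" from the text):
# w = (X^T X + lambda * I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_channels),
                    X_train.T @ y_train)

# Reconstruct the envelope on held-out data, then decode attention by
# correlating the reconstruction with each candidate speaker's envelope.
recon = X_test @ w
r_att = np.corrcoef(recon, env_attended[split:])[0, 1]
r_unatt = np.corrcoef(recon, env_unattended[split:])[0, 1]
decoded = "speaker 1" if r_att > r_unatt else "speaker 2"
```

Because the simulated recording is driven by the attended envelope, the reconstruction correlates more strongly with it, and `decoded` comes out as `"speaker 1"`; the same correlation-comparison step is what yields the above-chance single-trial accuracies reported in this literature.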
“…The strength of these direct recordings for examining auditory selective attention is exemplified by the study of Mesgarani and Chang (2012), which unequivocally demonstrated enhancement of neural activity related to the attended stream at the expense of the ignored stream within a multitalker environment. The robust nature of auditory activity modulated by selective attention was further emphasized in a study of spatial selective attention, where activity recorded from a single electrode over the superior temporal gyrus (STG) was sufficient to predict the attended stream with above-chance accuracy (Dijkstra et al, 2015). …”
Section: Introduction
confidence: 99%