2018
DOI: 10.1088/2057-1976/aada1a

Online classification of the near-infrared spectroscopy fast optical signal for brain-computer interfaces

Abstract: Objective. The fast optical signal (FOS), measured with near-infrared spectroscopy (NIRS), has high temporal and competitive spatial resolution, which provides an opportunity for a novel brain-computer interface modality. However, the reliability of the FOS has been debated due to its low signal-to-noise ratio. Approach. This study examined the feasibility of automatically classifying the prefrontal FOS response during a visual oddball task. FOS measurements were collected from 15 participants during 3 offline a…

Cited by 6 publications (7 citation statements)
References 61 publications
“…Both algorithms produced similar accuracies (76% for SVM and 75% for LDA); however, SVM resulted in a better balance between sensitivity and specificity (79% and 71%, respectively) compared to LDA (83% and 58%, respectively). Overall, these estimates of classification accuracy are in line with previous reports (Naseer and Hong, 2015a) and meet the minimum threshold of 70% for a BCI to be considered effective for communication (Proulx et al., 2018). Although the classification accuracy is comparable to results from other fNIRS studies involving various activation tasks for mental communication (Naseer and Hong, 2015b), it was less than the accuracy reported in an fMRI study involving the same tennis imagery task (Monti et al., 2010).…”
Section: Discussion (supporting)
confidence: 86%
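The sensitivity and specificity figures quoted above reduce to a single balanced-accuracy number: the mean of the two class-wise recall rates. A minimal Python sketch using the percentages from the citation; the function name is illustrative and not the authors' code:

```python
# Balanced accuracy as the mean of sensitivity (true positive rate)
# and specificity (true negative rate). Values below are the SVM and
# LDA figures quoted in the citation statement above.

def balanced_accuracy(sensitivity: float, specificity: float) -> float:
    """Mean of the two class-wise recall rates."""
    return (sensitivity + specificity) / 2

svm = balanced_accuracy(0.79, 0.71)  # SVM: sensitivity 79%, specificity 71%
lda = balanced_accuracy(0.83, 0.58)  # LDA: sensitivity 83%, specificity 58%
print(f"SVM balanced accuracy: {svm:.3f}")  # 0.750
print(f"LDA balanced accuracy: {lda:.3f}")  # 0.705
```

Note how LDA's raw accuracy (75%) masks a weaker balance: its balanced accuracy is about 70.5%, versus 75.0% for SVM.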
“…This is probably a major advantage, as FOI may provide an ideal foundation technology for brain imaging data fusion [228]. With the help of this methodology, multiple researchers [204], [225], [229]–[231] have shown that fast optical signals can be measured consistently with combined high spatial and temporal resolution. However, some research groups have also questioned these possibilities [232], [233].…”
Section: Fast Optical Imaging (mentioning)
confidence: 99%
“…Among the variety of classifiers available, the support vector machine (SVM) has been demonstrated to be highly effective for classifying brain signals [22][23][24][25][26][27]. Proulx et al (2018), for example, applied SVM and linear discriminant analysis (LDA) to FOS features (light intensity and phase delay, a measure of the average photon time of flight obtained with frequency-domain NIRS systems) to distinguish between unusual and common responses [28]. The outcomes were combined using a weighted majority vote.…”
Section: Introduction (mentioning)
confidence: 99%
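The weighted majority vote described above can be sketched as follows. The labels, weights, and function name are illustrative assumptions, not the values used by Proulx et al. (2018):

```python
# Hedged sketch of a weighted majority vote over per-feature classifier
# outputs (e.g., one SVM on intensity, one on phase delay). Weights and
# labels are illustrative, not taken from the cited study.

def weighted_majority_vote(predictions, weights):
    """Return the label whose votes carry the largest total weight."""
    totals = {}
    for label, weight in zip(predictions, weights):
        totals[label] = totals.get(label, 0.0) + weight
    return max(totals, key=totals.get)

# Two classifiers vote "rare" (total weight 1.1), one votes "common"
# (weight 0.8); the higher-weighted label wins.
print(weighted_majority_vote(["rare", "common", "rare"], [0.6, 0.8, 0.5]))  # rare
```

A weighted vote like this lets a more reliable feature channel (e.g., intensity) outvote a noisier one without discarding the latter entirely.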
“…The outcomes were combined using a weighted majority vote. FOS responses to rare and common images were classified among participants either offline, obtaining an average balanced accuracy of 62.5%, or online, with an average balanced accuracy of 63.6% [28]. The current study aimed to investigate the capability of combining FOS with SVM for single-trial retinotopy estimation in BCI applications.…”
Section: Introduction (mentioning)
confidence: 99%