2017
DOI: 10.3389/fnins.2017.00550
Kernel-Based Relevance Analysis with Enhanced Interpretability for Detection of Brain Activity Patterns

Abstract: We introduce Enhanced Kernel-based Relevance Analysis (EKRA) that aims to support the automatic identification of brain activity patterns using electroencephalographic recordings. EKRA is a data-driven strategy that incorporates two kernel functions to take advantage of the available joint information, associating neural responses to a given stimulus condition. Regarding this, a Centered Kernel Alignment functional is adjusted to learn the linear projection that best discriminates the input feature set, opt…

Cited by 18 publications (11 citation statements)
References: 65 publications
“…However, the grid search needed for tuning the free parameters increases the computational complexity exponentially, since the number of configurations grows with the product of the grid sizes defined over the characteristic and similarity kernel widths. Note that the performance of AKL-ABC depends on these grids and on G; however, in practice, the CKA and LNS algorithms converge quickly [32, 35].…”
Section: Results
Mentioning confidence: 99%
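The exponential cost the quote refers to is simply the product of the grid sizes for each tuned parameter. A minimal sketch (the grid sizes below are illustrative assumptions, not values from the paper):

```python
from itertools import product

def grid_search_cost(grid_sizes):
    """Number of configurations visited by an exhaustive grid search:
    the product of the grid sizes, i.e., exponential in the number of
    tuned parameters."""
    cost = 1
    for g in grid_sizes:
        cost *= g
    return cost

# Hypothetical grids over the characteristic and similarity kernel widths.
sigmas_feat = [0.1 * k for k in range(1, 11)]   # 10 candidate widths
sigmas_sim = [0.1 * k for k in range(1, 21)]    # 20 candidate widths

n_configs = grid_search_cost([len(sigmas_feat), len(sigmas_sim)])
assert n_configs == 200
assert len(list(product(sigmas_feat, sigmas_sim))) == n_configs
```

Adding a third tuned parameter with its own grid multiplies the cost again, which is why the cited work relies on the fast convergence of CKA and LNS rather than a finer grid.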
“…Concerning the feature space, we assess the similarity via a Gaussian kernel function to build the kernel matrix through a feature mapping. Here, to perform the pairwise comparison between simulations, we also use a Mahalanobis distance whose weighting matrix is the inverse covariance matrix of the candidate set [32]. In this sense, we use the information regarding the similarities among candidates to state a notion of similarity between the simulations and a target observation.…”
Section: Methods
Mentioning confidence: 99%
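The two similarity measures named in the quote have standard textbook forms. A minimal sketch in generic notation (the paper's exact symbols were lost in extraction, so the names below are placeholders):

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Pairwise Gaussian (RBF) kernel:
    K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    d2 = np.maximum(d2, 0.0)  # guard against tiny negative values
    return np.exp(-d2 / (2.0 * sigma**2))

def mahalanobis(x, y, cov_inv):
    """Mahalanobis distance weighted by a given inverse covariance matrix."""
    d = x - y
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

K = gaussian_kernel_matrix(X, sigma=1.0)
assert np.allclose(np.diag(K), 1.0)  # self-similarity is 1
assert np.allclose(K, K.T)           # kernel matrix is symmetric

cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
assert mahalanobis(X[0], X[0], cov_inv) == 0.0
```

Unlike the Euclidean distance inside the Gaussian kernel, the Mahalanobis form rescales each direction by the data's covariance, which is why the cited work uses it for comparisons in the candidate space.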
See 1 more Smart Citation
“…The authors plan to enhance the Gaussian functional connectivity developed for feature extraction as future work, allowing a better understanding of its impact and interaction on BCI-related tasks. To identify potential non-learners, the efforts can be directed toward a twofold aim: to enhance the feature extraction by profiting from more elaborate methods for measuring multivariate similarity, like centered kernel alignment [58, 59], and to explore robust estimation approaches based on information metrics (like correntropy) to deal better with the variability [55, 60, 61]. Besides, modeling the temporal dependencies within each trial to compute the FC is an exciting research line.…”
Section: Discussion
Mentioning confidence: 99%
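Correntropy, mentioned above as a robust alternative, has a simple sample estimator: the mean of a Gaussian kernel applied to the pointwise differences of two signals. A minimal sketch under that standard definition (parameter names are placeholders):

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy between two equal-length signals:
    V(x, y) = mean(exp(-(x_i - y_i)^2 / (2 * sigma^2))).
    Large pointwise differences are exponentially downweighted,
    which makes the measure robust to outliers."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

x = np.array([0.0, 1.0, 2.0, 3.0])
assert correntropy(x, x) == 1.0  # identical signals give maximum similarity

# A single large outlier barely moves correntropy, unlike squared error.
y = x.copy()
y[0] += 100.0
assert correntropy(x, y) > 0.7
```

The robustness comes from the bounded kernel: an outlier contributes at most one vanishing term to the mean, instead of dominating it as in a mean-squared-error comparison.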
“…In particular, we match both estimated kernel embeddings through the centered kernel alignment (CKA), as detailed in [40], where the kernel is obtained from the matrix of predicted label probabilities built by concatenating, across the trial and subject sets, all label probability vectors.…”
Section: Materials and Methods
Mentioning confidence: 99%
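The CKA functional used above has a standard form: the normalized Frobenius inner product of two double-centered kernel matrices. A minimal sketch in generic notation (not the paper's exact symbols):

```python
import numpy as np

def center_kernel(K):
    """Double-center a kernel matrix: K_c = H K H, with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def cka(K1, K2):
    """Centered Kernel Alignment between two kernel matrices:
    <K1_c, K2_c>_F / (||K1_c||_F * ||K2_c||_F), in [0, 1] for PSD kernels;
    1 means the two similarity structures are perfectly aligned."""
    K1c, K2c = center_kernel(K1), center_kernel(K2)
    num = np.sum(K1c * K2c)
    den = np.linalg.norm(K1c) * np.linalg.norm(K2c)
    return float(num / den)

rng = np.random.default_rng(1)
X = rng.normal(size=(6, 4))
K = X @ X.T  # linear kernel on some sample data

assert np.isclose(cka(K, K), 1.0)        # a kernel aligns perfectly with itself
assert np.isclose(cka(K, 2.0 * K), 1.0)  # CKA is invariant to kernel scaling
```

The scale invariance shown in the last assertion is what makes CKA a convenient matching criterion between embeddings computed on different domains, such as the per-trial and per-subject label-probability kernels described in the quote.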