The recognition of object categories is accomplished effortlessly in everyday life, yet its neural underpinnings are not fully understood. In this electroencephalography (EEG) study, we used single-trial classification to perform a Representational Similarity Analysis (RSA) of the categorical representation of objects in human visual cortex. Brain responses were recorded while participants viewed a set of 72 photographs of objects with a planned category structure. The Representational Dissimilarity Matrix (RDM) used for RSA was derived from the confusions of a linear classifier operating on single EEG trials. In contrast to past studies, which used pairwise correlation or classification to derive the RDM, we used confusion matrices from multi-class classifications, which provided novel self-similarity measures from which we estimated the overall size of the representational space. We additionally performed classifications on subsets of the brain response to identify the spatial and temporal EEG components that best discriminated object categories and exemplars. Category-level classifications revealed that brain responses to images of human faces formed the most distinct category, while responses to images from the two inanimate categories formed a single category cluster. Exemplar-level classifications produced a broadly similar category structure, as well as sub-clusters corresponding to natural language categories. The spatiotemporal components of the brain response that differentiated exemplars within a category differed from those implicated in differentiating between categories. Our results show that a classification approach can be successfully applied to single-trial scalp-recorded EEG to recover fine-grained object category structure and to identify interpretable spatiotemporal components underlying object processing. Finally, object category could be decoded from purely temporal information recorded at single electrodes.
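As an illustration of the classifier-confusion approach described in this abstract, the following Python sketch derives a representational dissimilarity matrix from the cross-validated confusion matrix of a multi-class linear classifier. This is not the code used in the study: the simulated data, the scikit-learn LDA classifier, the class and trial counts, and the particular confusion-to-dissimilarity mapping are illustrative assumptions only.

```python
# Minimal sketch (not the authors' pipeline): build an RDM from the
# confusion matrix of a multi-class linear classifier applied to
# single-trial data. In practice, X would hold trial-by-feature EEG
# responses and y the category (or exemplar) label of each trial.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_classes, trials_per_class, n_features = 6, 80, 128

# Simulated single trials: each class has a distinct mean pattern plus noise.
means = rng.normal(0, 1, size=(n_classes, n_features))
X = np.vstack([means[c] + rng.normal(0, 4, size=(trials_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), trials_per_class)

# Cross-validated predictions from a linear classifier on single trials.
pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=5)
conf = confusion_matrix(y, pred).astype(float)
conf /= conf.sum(axis=1, keepdims=True)   # row-normalise to confusion rates

# Symmetrise and convert confusions to dissimilarities: classes that are
# often confused end up close together in the RDM. The raw diagonal of the
# confusion matrix (self-similarity) can be kept separately as an index of
# how distinct each class is overall.
sim = 0.5 * (conf + conf.T)
rdm = 1.0 - sim
np.fill_diagonal(rdm, 0.0)
print(np.round(rdm, 2))
```

The same recipe can be rerun on subsets of electrodes or time windows to ask which spatiotemporal components carry category versus exemplar information, which is the spirit of the subset classifications mentioned above.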
While magnetoencephalography (MEG) is widely used to identify the spatial locations of brain activations associated with various tasks, classification of single trials in stimulus-locked experiments remains an open problem. Highly significant single-trial classification results have been published using electroencephalogram (EEG) data, but in the MEG case the weakness of the magnetic fields originating from the relevant sources relative to external noise, together with the high dimensionality of the data, poses difficult obstacles. Here we present highly significant MEG single-trial mean classification rates for words. The number of words classified varied from seven to nine, and both visual and auditory modalities were studied. These results were obtained using a variety of blind source separation methods: spatial principal component analysis (PCA), Infomax independent component analysis (Infomax ICA), and second-order blind identification (SOBI). The resulting sources were classified using two methods: linear discriminant classification (LDC) and the ν-support vector machine (ν-SVM). The data used here, auditory and visual presentations of words, posed nontrivial classification problems, but with Infomax ICA combined with LDC we obtained high classification rates. Our best single-trial mean classification rate was 60.1% for the classification of 900 single trials of nine auditory words. On two-class problems, rates were as high as 97.5%.
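The following Python sketch illustrates, under simplifying assumptions, the overall shape of such a pipeline: unmix the multichannel sensor data with ICA, then classify the single-trial source activations with a linear discriminant classifier. scikit-learn's FastICA stands in for the Infomax ICA used in the study, and the simulated data, component counts, and feature construction are illustrative assumptions only.

```python
# Minimal sketch (not the authors' pipeline): ICA-based source separation
# followed by linear discriminant classification of single trials.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels, n_times, n_classes = 300, 64, 30, 3

# Simulated trials: class-specific temporal patterns mixed into channels.
y = rng.integers(0, n_classes, size=n_trials)
class_sources = rng.normal(0, 1, size=(n_classes, n_times))
mixing = rng.normal(0, 1, size=(n_channels, 1))
X = np.stack([mixing @ class_sources[c][None, :]
              + rng.normal(0, 2, size=(n_channels, n_times)) for c in y])

# Fit ICA on the concatenated sensor data, then project each trial onto the
# estimated sources and use their time courses as classifier features.
ica = FastICA(n_components=5, random_state=0, max_iter=1000)
ica.fit(X.transpose(0, 2, 1).reshape(-1, n_channels))       # samples x channels
sources = np.stack([ica.transform(trial.T) for trial in X])  # trials x times x comps
features = sources.reshape(n_trials, -1)

# Cross-validated linear discriminant classification of single trials.
scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
print(f"mean single-trial accuracy: {scores.mean():.2f}")
```

Reducing the sensor data to a handful of unmixed components before classification is the main design choice here: it tackles the high dimensionality and poor signal-to-noise ratio that the abstract identifies as the principal obstacles for single-trial MEG.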