Preprint (2020)
DOI: 10.1101/2020.01.14.900688

SpikeForest: reproducible web-facing ground-truth validation of automated neural spike sorters

Abstract: Spike sorting is a crucial but time-intensive step in electrophysiological studies of neuronal activity. While there are many popular software packages for spike sorting, there is little consensus about which are the most accurate under different experimental conditions. SpikeForest is an open-source and reproducible software suite that benchmarks the performance of automated spike sorting algorithms across an extensive, curated database of electrophysiological recordings with ground truth, displaying results …
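The headline quantity in this kind of ground-truth benchmarking is a per-unit agreement score between a sorted spike train and the corresponding true spike train. The snippet below is a minimal, illustrative sketch of one common way to compute such a score from matched, missed, and spuriously detected spikes; the function name and example counts are assumptions for illustration, not code taken from the preprint.

    def unit_accuracy(n_match: int, n_miss: int, n_false_positive: int) -> float:
        # Matched spikes divided by everything either the sorter or the
        # ground truth claims for this unit (illustrative agreement score).
        total = n_match + n_miss + n_false_positive
        return n_match / total if total else 0.0

    # Example: 950 matched spikes, 30 missed, 20 spurious detections -> 0.95
    print(unit_accuracy(950, 30, 20))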

Cited by 18 publications (37 citation statements) | References 66 publications

Citation statements
“…For these reasons, we believe that spike sorting validation cannot be solely limited to simulated recordings. In a recent effort for spike sorting validation, named SpikeForest [31], the authors have gathered more than 650 groundtruth recordings belonging to different categories: paired recordings, simulated synthetic recordings (including MEArec-generated datasets), hybrid recordings, and manually sorted data. We think that a systematic benchmark of spike sorting tools will benefit from this larger collection of diverse groundtruth recordings, and in this light, MEArec can provide high-quality simulated datasets to aid this purpose.…”
Section: Discussion (mentioning)
confidence: 99%
“…The combination of MEArec and SpikeInterface represents a powerful tool for systematically testing and comparing spike sorter performances with respect to several complications of extracellular recordings. MEArec simulations, in combination with SpikeInterface, are already being used by other groups to benchmark and compare spike sorting algorithms by the SpikeForest project [31].…”
(mentioning)
confidence: 99%
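To make the workflow described in this statement concrete, the following is a minimal sketch of benchmarking a single sorter on a MEArec simulation via SpikeInterface and comparing its output to the simulated ground truth. This is not the SpikeForest pipeline itself; the file name ("mearec_recording.h5"), the sorter choice ("tridesclous"), and the exact call names are assumptions based on recent SpikeInterface releases.

    import spikeinterface.full as si

    # Load a MEArec-generated recording together with its ground-truth sorting
    # (hypothetical file name; the returned pair assumes a recent read_mearec API)
    recording, gt_sorting = si.read_mearec("mearec_recording.h5")

    # Run one automated spike sorter on the simulated recording
    sorting = si.run_sorter("tridesclous", recording)

    # Compare the sorter's output against the known ground truth
    comparison = si.compare_sorter_to_ground_truth(gt_sorting, sorting)
    print(comparison.get_performance())  # per-unit accuracy, precision, recall

Repeating this comparison over many recordings and many sorters yields the kind of aggregate accuracy tables that benchmarks such as SpikeForest report.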
“…Despite the development and widespread use of automatic spike sorters, there still exist no clear standards for how spike sorting should be performed or evaluated [62,11,18,47]. Research labs that are beginning to experiment with high-density extracellular recordings have to choose from a multitude of spike sorters, data processing algorithms, file formats, and curation tools just to analyze their first recording.…”
Section: Introduction (mentioning)
confidence: 99%
“…Research labs that are beginning to experiment with high-density extracellular recordings have to choose from a multitude of spike sorters, data processing algorithms, file formats, and curation tools just to analyze their first recording. As trying out multiple spike sorting pipelines is time-consuming and technically challenging, many labs choose one and stick to it as their de facto solution [47]. This has led to a fragmented software ecosystem which challenges reproducibility, benchmarking, and collaboration among different research labs.…”
Section: Introduction (mentioning)
confidence: 99%