“…Sequentially constructed algorithms, such as those built upon multiple basic dense layers (Mahallati et al., 2019; Yeganegi et al., 2020), convolutional layers (Li et al., 2020b), and recurrent layers (Rácz et al., 2020), require substantial memory for their parameters, although complexity may be reduced by binarizing the weights and activation functions (Valencia and Alimohammad, 2021) or by parallelization on graphics processing units (Tam and Yang, 2018). These layers may be combined in different ways, mainly to mitigate or eliminate the need for hand-labeled neural data during training: autoencoders (Weiss, 2019; Radmanesh et al., 2021; Rokai et al., 2021) and networks trained with adversarial (Wu et al., 2019; Ciecierski, 2020) or reinforcement learning paradigms (Salman et al., 2018; Moghaddasi et al., 2020) have successfully clustered features originating from the noisiest datasets. Likewise, a more sophisticated learning-based method may even incorporate multiple steps of spike sorting, resolving detection, feature extraction, and clustering in a single integrated solution (Eom et al., 2021; Rokai et al., 2021), although manual curation remains advisable (Horváth et al., 2021).…”
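To illustrate why binarization cuts the complexity of such networks, a minimal sketch follows. It uses a generic sign-based weight binarization with a single scaling factor (in the style of XNOR-Net-like approximations, W ≈ α·sign(W)); this is an assumption for illustration, not the specific scheme of Valencia and Alimohammad (2021), and the layer dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dense-layer weights of a spike-sorting network (arbitrary sizes).
W = rng.standard_normal((32, 48)).astype(np.float32)
x = rng.standard_normal(48).astype(np.float32)

# Sign-based binarization with one scaling factor: W is approximated by
# alpha * sign(W), so each weight needs a single bit instead of 32.
alpha = np.abs(W).mean()
W_bin = np.sign(W)  # entries in {-1, +1}

# Forward pass with binarized weights: multiplications collapse into
# sign flips and additions, plus one final scaling by alpha.
y_full = W @ x
y_bin = alpha * (W_bin @ x)

# The binarized output remains strongly correlated with the
# full-precision output, at a fraction of the storage cost.
print(np.corrcoef(y_full, y_bin)[0, 1])
```

The trade-off sketched here is the one the cited works exploit: a small loss in numerical fidelity in exchange for roughly 32× smaller weight storage and cheaper arithmetic, which matters for the resource-constrained settings spike sorting often targets.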