“…The numerical handling is instead carried out by a PyTorch backend, which enables SHIP to inherit PyTorch's advantages and functionalities: (i) access to optimized libraries enabling fast matrix computation, (ii) the network optimization algorithms … and S1 for further information). The list of platforms appears in chronological order of release, from the earliest to the most recent, and encompasses the following: GENESIS (Bower and Beeman, 2007), XPPAUT (Bard, 1996), NEURON (Hines et al., 2020), NCS (Drewes, 2005; Hoang et al., 2013), EDLUT (Ros et al., 2006), NEST (Gewaltig and Diesmann, 2007), CARLSim (Niedermeier et al., 2022), NeMo (Fidjeland et al., 2009), CNS (Poggio et al., 2010), GeNN (Yavuz et al., 2016), N2D2 (Bichler et al., 2017), Nengo (Bekolay et al., 2014), Auryn (Zenke and Gerstner, 2014), Brian 2 (Stimberg et al., 2019), NEVESIM (Pecevski et al., 2014), ANNarchy (Vitay et al., 2015), MegaSim (Stromatias et al., 2017), BindsNET (Hazan et al., 2018), DynaSim (Sherfey et al., 2018), SPIKE (Ahmad et al., 2018), LSNN (Bellec et al., 2018), cuSNN (Paredes-Valles et al., 2020), Slayer (Shrestha and Orchard, 2018), RockPool (Muir et al., 2019), SpykeTorch (Mozafari et al., 2019), PySNN (Büller, 2020), s2net (Zimmer et al., 2019), sinabs (Lenz and Sheik, 2020), DECOLLE (Kaiser et al., 2020), Spice (Bautembach et al., 2020), Spiking Jelly (Fang et al., 2020), Sapicore (Moyal et al., 2021), Norse (Pehle and Pedersen, 2021), Lava (Richter et al., 2021), snnTorch…”
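The two PyTorch advantages mentioned at the start of the excerpt (optimized libraries for fast matrix computation, and built-in network optimization algorithms) can be illustrated with a minimal, generic sketch. This is not SHIP's actual API; the variable names (weights, spikes_in, target) and the rate-based surrogate loss are hypothetical placeholders chosen only to show how a PyTorch backend exposes tensor algebra and torch.optim optimizers to a spiking-network simulator.

```python
# Minimal, generic sketch (not SHIP's API): a PyTorch backend expresses synaptic
# propagation as batched matrix operations and reuses torch.optim for training.
import torch

torch.manual_seed(0)

n_pre, n_post, batch = 64, 32, 16
weights = torch.randn(n_pre, n_post, requires_grad=True)   # hypothetical synaptic weight matrix
optimizer = torch.optim.Adam([weights], lr=1e-2)            # inherited optimization algorithm

spikes_in = (torch.rand(batch, n_pre) < 0.2).float()        # random binary input spikes
target = torch.rand(batch, n_post)                          # hypothetical target activity

for step in range(100):
    optimizer.zero_grad()
    currents = spikes_in @ weights                  # fast matrix computation on CPU/GPU
    rates = torch.sigmoid(currents)                 # smooth surrogate of firing activity
    loss = torch.nn.functional.mse_loss(rates, target)
    loss.backward()                                 # autograd supplies the gradients
    optimizer.step()
```

The same pattern extends to GPU execution by moving the tensors with .to("cuda"), which is one concrete sense in which a PyTorch backend grants "access to optimized libraries" without simulator-specific numerical code.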