“…An unbounded version of LTNs, termed rectified linear units (ReLUs), is widely studied in machine learning [28], [29]. In neuroscience, it has been used to model diverse brain states, including goal-driven selective attention [30], [31] and epilepsy [32]. In particular, recent works [30], [31], [33] have studied structural conditions for oscillations in networks of excitatory-inhibitory (EI) pairs by directly linking the lack of stable equilibria to the presence of oscillations.…”
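To make "unbounded version" concrete, the display below is a minimal sketch assuming the standard firing-rate form of linear-threshold dynamics; the saturation level $m$ and the symbols $W$, $c$, $\tau$ are illustrative and need not match the exact formulations of [28]–[33].

% Illustrative sketch (not necessarily the exact model of [28]--[33]):
% a bounded linear-threshold unit saturates at a level m, and the ReLU
% is recovered as the unbounded limit m -> infinity.
\[
  [u]_{0}^{m} \;=\; \min\{\max\{u,\,0\},\; m\},
  \qquad
  [u]_{+} \;=\; \max\{u,\,0\} \;=\; \lim_{m\to\infty}\,[u]_{0}^{m}.
\]
% With the nonlinearity applied entrywise, the corresponding
% firing-rate dynamics take the form
\[
  \tau\,\dot{x} \;=\; -x + \bigl[\,W x + c\,\bigr]_{+},
\]
% where x collects the excitatory and inhibitory rates, W the synaptic
% weights, and c the external input.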