Many cognitive and behavioral tasks, such as interval timing, spatial navigation, motor control and speech, require the execution of precisely timed sequences of neural activation that cannot be fully explained by a succession of external stimuli. We use a reservoir computing framework to explain how such neural sequences can be generated and employed in temporal tasks. We propose a general solution for recurrent neural networks to autonomously produce rich patterns of activity by providing a multi-periodic oscillatory signal as input. We show that the model accurately learns a variety of tasks, including speech generation, motor control and spatial navigation. Further, the model performs temporal rescaling of natural spoken words and exhibits sequential neural activity commonly found in experimental data involving temporal processing. In the context of spatial navigation, the model learns and replays compressed sequences of place cells and captures features of neural activity such as the emergence of ripples and theta phase precession. Together, our findings suggest that combining oscillatory neuronal inputs with different frequencies provides a key mechanism to generate precisely timed sequences of activity in recurrent circuits of the brain.

Reservoir computing | Neural oscillations | Temporal processing | Balanced networks

1. Introduction

Virtually every aspect of sensory, cognitive and motor processing in biological organisms involves operations unfolding in time (1). In the brain, neuronal circuits must represent time on a variety of scales, from milliseconds to minutes and longer circadian rhythms (2). Despite increasingly sophisticated models of brain activity, time representation remains a challenging problem in computational modelling (3, 4).

Recurrent neural networks offer a promising avenue to detect and produce precisely timed sequences of activity (5). However, these networks are challenging to train due to their complexity (6), particularly when operating in the chaotic regime associated with biological neural networks (7, 8).

One avenue to address this issue is reservoir computing (RC) (9, 10). Under this framework, a recurrent network (the reservoir) projects onto a read-out layer whose synaptic weights are adjusted to produce a desired response. However, while RC can capture some behavioral and cognitive processes (11-13), it often relies on biologically implausible mechanisms (14). Further, current RC implementations offer little insight into how the brain generates activity that does not follow a strict rhythmic pattern (1, 5). That is because RC models are either restricted to learning periodic functions, or require an aperiodic input to generate an aperiodic output, thus leaving the neural origins of aperiodic activity unresolved (5). A solution to this problem is to train the recurrent connections of the reservoir to stabilize innate patterns of activity (12), but this approach is more com...
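As an illustration of the reservoir computing setup described above, the minimal sketch below drives a fixed random recurrent network with a multi-periodic oscillatory input and trains only the linear read-out weights (here by ridge regression). This is not the authors' implementation; the network size, oscillation frequencies, time constant and target trajectory are illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the paper's implementation):
# an echo-state-style reservoir driven by a sum of sinusoids, with only
# the linear read-out weights trained.
import numpy as np

rng = np.random.default_rng(0)

N = 300          # reservoir units (assumed)
T = 2000         # time steps
dt = 1e-3        # step size in seconds
tau = 0.01       # unit time constant in seconds (assumed)
t = np.arange(T) * dt

# Multi-periodic input: sinusoids with several (assumed) frequencies in Hz.
freqs = np.array([1.0, 2.3, 3.7, 5.1])
u = np.sin(2 * np.pi * np.outer(t, freqs))            # shape (T, 4)

# Fixed random recurrent and input weights; recurrent gain > 1 places the
# autonomous network near the chaotic regime mentioned in the text.
W = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))
W_in = rng.normal(0.0, 1.0, size=(N, freqs.size))

# Run leaky tanh rate dynamics and record reservoir states.
x = np.zeros(N)
states = np.zeros((T, N))
for k in range(T):
    x = x + (dt / tau) * (-x + np.tanh(W @ x + W_in @ u[k]))
    states[k] = x

# Arbitrary aperiodic target trace (stands in for a motor or speech
# trajectory in the paper's tasks).
target = np.sin(2 * np.pi * 1.3 * t) * np.exp(-((t - 1.0) ** 2) / 0.1)

# Train only the read-out weights with ridge regression.
lam = 1e-4
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)
prediction = states @ W_out
print("read-out training RMSE:", np.sqrt(np.mean((prediction - target) ** 2)))
```

The key design point this sketch reflects is that the recurrent weights stay fixed and untrained; the oscillatory drive supplies the temporal structure, and only the read-out layer is fit to the desired output.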