Tel. +32 2 555 3286.

Highlights
• The brain tracks phrasal and syllabic rhythmicity of self-produced (read) speech.
• Tracking of phrasal structures is attenuated during reading compared with listening.
• Speech rhythmicity mainly drives brain activity during reading and listening.
• Brain activity drives syllabic rhythmicity more during reading than listening.

Abstract
To gain novel insights into how the human brain processes self-produced auditory information during reading aloud, we investigated the coupling between neuromagnetic activity and the temporal envelope of the heard speech sounds (i.e., speech brain tracking) in a group of adults who 1) read a text aloud, 2) listened to a recording of their own speech (i.e., playback), and 3) listened to another speech recording. Coherence analyses revealed that, during reading aloud, the reader's brain tracked the slow temporal fluctuations of the speech output. Specifically, auditory cortices tracked phrasal structure (<1 Hz), but to a lesser extent than during the two speech listening conditions. Also, the tracking of syllable structure (4-8 Hz) occurred at parietal opercula during reading aloud and at auditory cortices during listening. Directionality analyses based on renormalized partial directed coherence revealed that speech brain tracking at <1 Hz and 4-8 Hz is dominated by speech-to-brain directional coupling during both reading aloud and listening, meaning that speech brain tracking mainly entails auditory feedback processing. Nevertheless, brain-to-speech directional coupling at 4-8 Hz was enhanced during reading aloud compared with listening, likely reflecting speech monitoring before production. Altogether, these data bring novel insights into how auditory verbal information is tracked by the human brain during perception and self-generation of connected speech.
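The coherence analysis summarized above can be illustrated with a minimal sketch: magnitude-squared coherence between a speech temporal envelope (Hilbert-transform magnitude) and a brain-like signal. All signals, rates, and parameters below are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import coherence, hilbert

# Illustrative sketch (not the study's actual pipeline): coherence between
# a speech temporal envelope and a simulated "brain" signal.
fs = 1000                       # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of toy data
rng = np.random.default_rng(0)

# Toy "speech": broadband noise amplitude-modulated at a 5 Hz syllabic rate.
modulation = 1 + 0.8 * np.sin(2 * np.pi * 5 * t)
speech = modulation * rng.standard_normal(t.size)

# Temporal envelope of the speech via the Hilbert transform magnitude.
speech_env = np.abs(hilbert(speech))

# Toy "brain" signal that partially tracks the envelope, plus independent noise.
brain = 0.5 * modulation + rng.standard_normal(t.size)

# Magnitude-squared coherence; long windows (8 s) give sub-1 Hz resolution.
f, coh = coherence(speech_env, brain, fs=fs, nperseg=8 * fs)

# Coherence should peak in the 4-8 Hz (syllabic) band, near the 5 Hz
# modulation rate, and sit near chance level in an unrelated band.
syllabic_peak = coh[(f >= 4) & (f <= 8)].max()
baseline = coh[(f >= 20) & (f <= 30)].mean()
print(syllabic_peak > baseline)
```

The same machinery applies at phrasal rates (<1 Hz) provided the analysis windows are long enough to resolve them, which is why window length matters in such analyses.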
Keywords
Reading; speech perception; speech production; connected speech; speech brain tracking; magnetoencephalography

brain activity during connected speech production. Previous magnetoencephalography (MEG) studies focusing on connected speech listening demonstrated speech-sensitive coupling between the slow modulations of the speaker's voice and listeners' (mainly auditory) cortex activity (Bourguignon et al., 2013; Clumeck et al., 2014; Ding et al., 2016; Gross et al., 2013; Molinaro et al., 2016; Peelle et al., 2013; Vander Ghinst et al., 2016). This coupling, henceforth referred to as speech brain tracking, mainly occurs at syllabic (4-8 Hz) and phrasal/sentential (<1 Hz) rates. It is considered to play a pivotal role in parsing connected speech into smaller units (i.e., syllables or phrases/sentences) to promote subsequent speech recognition (Park et al., 2018; Zion Golumbic et al., 2012). Additionally, it might help predict the precise timing of events in the speech stream, such as syllables and phrases/sentences (Zion Golumbic et al., 2012). Suc...