We address how listeners perceive temporal regularity in music performances, which are rich in temporal irregularities. A computational model is described in which a small system of internal self-sustained oscillations, operating at different periods with specific phase and period relations, entrains to the rhythms of music performances. Based on temporal expectancies embodied by the oscillations, the model predicts the categorization of temporally changing event intervals into discrete metrical categories, as well as the perceptual salience of deviations from these categories. The model's predictions are tested in two experiments using piano performances of the same music with different phrase structure interpretations (Experiment 1) or different melodic interpretations (Experiment 2). The model successfully tracked temporal regularity amidst the temporal fluctuations found in the performances. The model's sensitivity to performed deviations from its temporal expectations corresponded well with the performers' structural (phrasal and melodic) intentions. Furthermore, the model tracked normal performances (with increased temporal variability) better than performances in which temporal fluctuations associated with individual voices were removed (with decreased variability). The small, systematic temporal irregularities characteristic of human performances (chord asynchronies) improved tracking, but randomly generated temporal irregularities did not. These findings suggest that perception of temporal regularity in complex musical sequences is based on temporal expectancies that adapt in response to temporally fluctuating input.
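To make the entrainment idea concrete, the sketch below reduces the model to a single adaptive oscillator that tracks a beat through temporally fluctuating onsets: at each event it measures the deviation from its expected beat time (a proxy for perceptual salience) and nudges its phase and period toward the input. This is a minimal illustration only; the published model uses a small system of oscillators at metrically related periods with nonlinear coupling, and the function name `entrain` and the parameters `eta_phase` and `eta_period` are illustrative assumptions rather than the authors' notation.

```python
def entrain(onsets, period0, eta_phase=0.3, eta_period=0.1):
    """Illustrative single-oscillator beat tracker (not the authors' exact equations).

    For each onset, compute the relative phase (deviation from the expected
    beat time as a fraction of the current period), record its magnitude as
    a salience value, and apply proportional phase and period corrections.
    """
    period = period0
    expected = onsets[0] + period          # first expectation: one period after the first onset
    saliences = []
    for onset in onsets[1:]:
        # deviation of the onset from expectation, wrapped to (-0.5, 0.5] periods
        rel_phase = ((onset - expected + period / 2) % period - period / 2) / period
        saliences.append(abs(rel_phase))   # larger deviation -> more salient event
        # entrainment: shift the expectation and stretch/shrink the period
        expected += eta_phase * rel_phase * period
        period *= 1 + eta_period * rel_phase
        expected += period                 # project the next expected beat
    return saliences, period


# Example: inter-onset intervals fluctuate around 0.5 s, as in expressive performance
onsets = [0.0, 0.50, 1.02, 1.50, 2.05, 2.53, 3.00]
saliences, final_period = entrain(onsets, period0=0.5)
print(saliences, final_period)
```

In this toy version, larger salience values mark events that depart most from the oscillator's temporal expectancy, which is the kind of quantity the model's predictions about phrasal and melodic deviations are based on.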