2009
DOI: 10.1080/09540090902733756

Information dynamics: patterns of expectation and surprise in the perception of music

Abstract: Measures such as entropy and mutual information can be used to characterise random processes. In this paper, we propose the use of several time-varying information measures, computed in the context of a probabilistic model which evolves as a sample of the process unfolds, as a way to characterise temporal structure in music. One such measure is a novel predictive information rate which we conjecture may provide a conceptually simple explanation for the 'inverted-U' relationship often found between simple measu…
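The abstract's starting point is that entropy and mutual information characterise random processes. As a minimal sketch of those two quantities, the following computes both for a small discrete joint distribution; the distribution itself is made up for illustration and does not come from the paper.

```python
import numpy as np

# Illustrative joint distribution p(x, y) over two binary variables.
pxy = np.array([[0.3, 0.1],
                [0.1, 0.5]])

def H(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Marginals, and mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
mi = H(px) + H(py) - H(pxy)
```

Because the off-diagonal mass is small, the two variables are strongly dependent and the mutual information is well above zero; an independent joint distribution would give exactly zero.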

Cited by 67 publications (106 citation statements)
References 24 publications (18 reference statements)
“…Information Dynamic approaches have so far treated events as successive and accretive, without taking regard for their relative timing, or directly addressing issues of temporal autocorrelation. Indeed, much of the Information Dynamics work has been done on monophonic isochronic music, such as Glass' Gradus (Abdallah & Plumbley, 2009; Potter et al., 2007). The Information Dynamic approach can be enhanced further by superimposed time series analyses, as well as by awareness that notationally- or musicologically-defined features may or may not be perceptible and readily categorized, and hence may or may not contribute new information.…”
Section: Discussion
confidence: 99%
“…The time series analysis methodology developed in this paper can support ongoing work to understand the perception of music in terms of dynamically changing information content (Abdallah & Plumbley, 2009; Pearce & Wiggins, 2006; Potter, Wiggins, & Pearce, 2007; reviewed by Wiggins, Pearce, & Müllensiefen, 2009). In an information dynamics approach it is assumed that the predictive capacity and decisions of an observer concerning future events in the musical stream are continuously evolving as new information is assimilated into the observer's (potentially individualistic) data probability structure.…”
Section: Discussion
confidence: 99%
“…It will also be useful to examine ways of tailoring IDyOM specifically for segmentation, including a metrically-based rather than an event-based representation of time, optimising the derived features that it uses to make event predictions, and using other information-theoretic measures such as entropy or predictive information (Abdallah and Plumbley 2009). One of the attractive features of the model is that such measures (and the learning models on which they rely) are in no sense domain-specific and so can be applied generally.…”
Section: Discussion
confidence: 99%
“…It also means that the sequence of surprisingness values is itself uncorrelated in time. This is in marked contrast with the situation for Markov chains [4], where, in general, the expected surprise depends on the previous observation and thus varies in time, reflecting the observer's varying levels of uncertainty about the next observation. In a Gaussian process, this predictive uncertainty is constant and therefore does not provide any useful structural analysis of the sequence.…”
Section: Process Information Measures For Gaussian Processes
confidence: 98%
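The quoted contrast — that expected surprise varies along a Markov chain but is constant for the Gaussian case — can be made concrete with a small numerical sketch. The two-state transition matrix below is a made-up example, not taken from either paper: the expected surprise at each step is just the entropy of the next-observation distribution given the current state, and it differs between the two states.

```python
import numpy as np

# Hypothetical 2-state Markov chain transition matrix (rows = current state).
P = np.array([[0.9, 0.1],    # from state 0: outcome is fairly predictable
              [0.5, 0.5]])   # from state 1: outcome is maximally uncertain

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability terms."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Expected surprise conditional on each current state: it varies with the
# state, so it varies in time as the chain moves between states.
h = [entropy(P[s]) for s in range(2)]
```

State 1 leaves the observer a full bit of uncertainty while state 0 leaves much less, which is exactly the state-dependent predictive uncertainty the excerpt says a (stationary) Gaussian process lacks.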
“…Addressing the observation that some processes with long-range dependencies have infinite excess entropy [2], Bialek et al [3] introduced the predictive information as the mutual information between a finite segment of a process and the infinite future following it, and studied its behaviour, especially in relation to learning in statistical models. In previous work [4], we defined the predictive information rate (PIR) of a random process as the average information in one observation about future observations yet to be made given the observations made so far; thus, it quantifies the new information in observations made sequentially. The PIR captures a dimension of temporal structure that is not accounted for by previously proposed measures.…”
Section: Introduction
confidence: 99%
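The excerpt defines the predictive information rate (PIR) as the average information one observation carries about the future given the past. As a hedged numerical sketch only: for a stationary first-order Markov chain this can be written as H(X_{t+1}|X_{t-1}) − H(X_{t+1}|X_t), i.e. the entropy rate of the two-step chain minus that of the one-step chain (an identity used in the authors' related work on Markov chains; the transition matrix below is illustrative, not from the paper).

```python
import numpy as np

# Hypothetical 2-state transition matrix for the sketch.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def entropy_rate(T, pi):
    """Stationary average of the per-row next-state entropies, in bits."""
    H_rows = [-np.sum(r[r > 0] * np.log2(r[r > 0])) for r in T]
    return float(np.dot(pi, H_rows))

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalised to sum to one.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# PIR sketch: H(X_{t+1}|X_{t-1}) - H(X_{t+1}|X_t), bits per observation.
pir = entropy_rate(P @ P, pi) - entropy_rate(P, pi)
```

The value is strictly positive here: observing the current state genuinely reduces uncertainty about the step after next, which is the "new information about the future" the PIR is meant to quantify.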