2008
DOI: 10.4018/jcini.2008070102
Artificial Neural Networks that Classify Musical Chords

Abstract: An artificial neural network was trained to classify musical chords into four categories—major, dominant seventh, minor, or diminished seventh—independent of musical key. After training, the internal structure of the network was analyzed in order to determine the representations that the network was using to classify chords. It was found that the first layer of connection weights in the network converted the local representations of input notes into distributed representations that could be described in musica…
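The abstract omits the architectural details, so the following is only a minimal sketch of the kind of network it describes: a small multilayer perceptron that maps a 12-unit, pitch-class style input (one unit per note name, as in the citation excerpts below) to four chord-quality outputs. The layer sizes, sigmoid activations, and random initialization are assumptions for illustration, not the network reported in the paper.

```python
# Hypothetical sketch (not the authors' code): a small multilayer perceptron that
# maps a 12-element pitch-class input to one of four chord qualities
# (major, dominant seventh, minor, diminished seventh), independent of key.
import numpy as np

rng = np.random.default_rng(0)

N_INPUT, N_HIDDEN, N_OUTPUT = 12, 6, 4          # layer sizes are illustrative only
W1 = rng.normal(0.0, 0.1, (N_HIDDEN, N_INPUT))  # first layer of connection weights
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_OUTPUT, N_HIDDEN))
b2 = np.zeros(N_OUTPUT)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(pitch_classes):
    """Forward pass: local note code -> hidden (distributed) code -> four chord categories."""
    hidden = sigmoid(W1 @ pitch_classes + b1)   # distributed re-coding of the input notes
    return sigmoid(W2 @ hidden + b2)            # one output unit per chord quality

# Example input: C major triad as pitch classes C (0), E (4), G (7)
x = np.zeros(N_INPUT)
x[[0, 4, 7]] = 1.0
print(classify(x))   # untrained weights, so all four outputs sit near 0.5
```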

Cited by 15 publications (10 citation statements)
References 24 publications
“…Connectionist networks can accomplish a variety of tasks that require classification of basic elements of Western music (e.g., pitch, tonality, and harmony). Artificial neural networks have been trained to classify chords (Laden & Keefe, 1989; Yaremchuk & Dawson, 2005, 2008), to assign notes to structures similar to the tonal hierarchy (Leman, 1991; Scarborough, Miller, & Jones, 1989), to model the effects of musical expectations on musical perception (Bharucha, 1987; Bharucha & Todd, 1989), to add harmony to melodies (Berkeley & Raine, 2011; Shibata, 1991), to determine the musical key of a melody (Griffith, 1995), to identify a melody even when it has been transposed into a different key (Benuskova, 1995; Bharucha & Todd, 1989; Page, 1994; Stevens & Latimer, 1992), and to detect the chord patterns in a composition (Gjerdingen, 1992). Artificial neural networks can also handle other important aspects of music that are independent of tonality, such as assigning rhythm and meter (Desain & Honing, 1989; Griffith & Todd, 1999; Large & Kolen, 1994) or generating preferences for, or expectancies of, particular rhythmic patterns (Gasser, Eck, & Port, 1999).…”
Section: Training Key-finding Network On Tone Profiles
confidence: 99%
“…To build a training set for teaching the Coltrane changes to an artificial neural network, we discovered a graphical representation of this progression, which is provided in Figure 3 below. This representation takes advantage of the fact that previous interpretations of musical neural networks (Yaremchuk & Dawson, 2008) have revealed musical circles analogous to, but different from, the circle of fifths in Figure 1. Some networks encode musical knowledge using the four circles of major thirds presented in Figure 2.…”
Section: The Coltrane Changes
confidence: 99%
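For readers unfamiliar with the structure mentioned in the excerpt above: the 12 pitch classes partition into exactly four circles of major thirds, each obtained by repeatedly stepping up four semitones. The sketch below simply enumerates those circles; it is an illustration of the music-theoretic structure, not code from the cited work.

```python
# Illustrative only: partition the 12 pitch classes into the four circles of
# major thirds (each circle steps by 4 semitones and contains 3 pitch classes).
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_of_major_thirds(start):
    """Pitch classes reached from `start` by repeatedly moving up a major third."""
    return [(start + 4 * k) % 12 for k in range(3)]

for start in range(4):   # four distinct circles, starting on C, C#, D, D#
    circle = circle_of_major_thirds(start)
    print([NOTE_NAMES[p] for p in circle])
# ['C', 'E', 'G#']
# ['C#', 'F', 'A']
# ['D', 'F#', 'A#']
# ['D#', 'G', 'B']
```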
“…each unit was analogous to a key on a piano). Each output unit used a non-linear, Gaussian activation function, as was the case in previous studies of this type (Yaremchuk & Dawson, 2008). We trained this network in a similar fashion to the method used to train the II-V-I network mentioned earlier: the network was presented a chord in the progression, and an error-correcting rule was used to teach the network to output the next chord in the progression.…”
Section: Perceptrons Can Learn Coltrane Changes
confidence: 99%
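As a rough illustration of the training setup described in the excerpt above, the sketch below implements a single-layer network whose output units apply a Gaussian to their net input and are adjusted with a gradient-based error-correcting rule so that each chord predicts the next one in a progression. The specific Gaussian, learning rate, initialization, and the toy four-chord progression are assumptions; they are not the Coltrane changes or the parameters used in the cited studies.

```python
# A minimal sketch, assuming details the excerpt does not give: a single-layer
# perceptron with Gaussian output activations, trained by an error-correcting
# (delta) rule to output the next chord in a progression.
import numpy as np

N_PITCH = 12          # pitch-class ("piano key") coding on both input and output
LEARNING_RATE = 0.1   # illustrative value

rng = np.random.default_rng(1)
W = rng.normal(0.0, 0.1, (N_PITCH, N_PITCH))
b = np.zeros(N_PITCH)

def gaussian(net):
    return np.exp(-net ** 2)          # non-linear, Gaussian activation (assumed form)

def forward(x):
    net = W @ x + b
    return gaussian(net), net

def train_step(x, target):
    """Present one chord; nudge weights so the output moves toward the next chord."""
    global W, b
    out, net = forward(x)
    error = target - out
    # delta rule using the Gaussian's derivative: d/dnet exp(-net^2) = -2*net*exp(-net^2)
    delta = error * (-2.0 * net * np.exp(-net ** 2))
    W += LEARNING_RATE * np.outer(delta, x)
    b += LEARNING_RATE * delta
    return np.mean(error ** 2)

def chord(pitches):
    v = np.zeros(N_PITCH)
    v[list(pitches)] = 1.0
    return v

# Toy progression (assumed, not the Coltrane changes): each chord should yield the next.
progression = [chord({0, 4, 7}), chord({9, 0, 4}), chord({2, 5, 9}), chord({7, 11, 2})]
for epoch in range(2000):
    for i, x in enumerate(progression):
        train_step(x, progression[(i + 1) % len(progression)])

# After training, high outputs for the first chord should roughly line up with
# the pitch classes of the second chord.
print(np.round(forward(progression[0])[0], 2))
```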
“…The input chords were encoded with a pitch class representation (Laden & Keefe, 1989; Yaremchuk & Dawson, 2008). In a pitch class representation, only 12 input units are employed, one for each of the 12 different notes that can appear in a scale.…”
Section: Chord Classification By a Multilayer Perceptron
confidence: 99%
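A brief, assumed illustration of the encoding described in the excerpt above: each chord becomes a 12-element vector with ones at the pitch classes it contains, so transposing a chord only rotates which units are active, which is what makes key-independent classification plausible. The chord-interval table is standard music theory; the function names are hypothetical.

```python
# Illustrative pitch-class encoding: 12 input units, one per note name, regardless of octave.
import numpy as np

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
CHORD_INTERVALS = {                 # semitone offsets from the root (standard music theory)
    "major":              (0, 4, 7),
    "minor":              (0, 3, 7),
    "dominant seventh":   (0, 4, 7, 10),
    "diminished seventh": (0, 3, 6, 9),
}

def pitch_class_vector(root, quality):
    """12-unit input pattern for a chord of the given quality built on `root`."""
    vec = np.zeros(12)
    for interval in CHORD_INTERVALS[quality]:
        vec[(root + interval) % 12] = 1.0
    return vec

print(pitch_class_vector(NOTE_NAMES.index("C"), "dominant seventh"))
print(pitch_class_vector(NOTE_NAMES.index("D"), "dominant seventh"))  # same pattern, rotated two steps
```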