To classify a chord, only the pitch classes of the notes involved are relevant, i.e., the note names without octave number. A variety of different chord types exists, each characterised by the sizes of the intervals between the notes of the chord.
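For illustration, the following is a minimal sketch (not taken from the text) of how a chord type can be read off the pitch classes alone: each chord type corresponds to a characteristic set of semitone intervals above the root, independent of octave. The template dictionary and the function name classify_chord are assumptions made only for this example.

```python
# Hypothetical interval templates (semitones above the root) for common chord types.
CHORD_TEMPLATES = {
    "major":            (0, 4, 7),
    "minor":            (0, 3, 7),
    "diminished":       (0, 3, 6),
    "augmented":        (0, 4, 8),
    "dominant seventh": (0, 4, 7, 10),
}

PITCH_CLASS_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]


def classify_chord(midi_notes):
    """Map a set of notes (MIDI numbers) to (root name, chord type), or None."""
    pcs = {n % 12 for n in midi_notes}          # discard octave information
    for root in pcs:                            # try every note as candidate root
        intervals = tuple(sorted((p - root) % 12 for p in pcs))
        for name, template in CHORD_TEMPLATES.items():
            if intervals == tuple(sorted(template)):
                return PITCH_CLASS_NAMES[root], name
    return None


# E4, G4, and C5 reduce to the pitch classes {E, G, C}: a C major triad.
print(classify_chord([64, 67, 72]))  # -> ('C', 'major')
```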
The automatic recognition and transcription of musical chords and their progression has manifold application potential:
In spontaneous improvisation sessions of musicians, such as 'jams', the progression can be analysed and stored as a lead sheet, or media players can automatically identify and display the current chord in a musical piece for play-along by humans or even the computer. Knowledge of the chord structure can also be used as meta-information in MIR tasks. A good example is genre recognition, as certain genres prefer typical progression patterns (e.g., Jazz: second, fifth, tonic successions, or Blues: tonic, fourth, fifth as dominant seventh chord successions).
Another example is musical mood recognition (e.g., the ratio of major and minor chords; this will be shown in Sect. 11.7). Obviously, key recognition can also benefit from this information, and vice versa, which is why a simultaneous key and chord analysis seems promising. Structure analysis, e.g., for chorus retrieval [30] (cf. Sect. 11.6), can also be based on the chord progression, as it often differs between different parts of a musical piece such as verse, bridge, and chorus. Moreover, DJs can be provided with automatic on-line synthesis of chord-matching notes as very low sub-basses or arpeggios, and with tools that allow blending pieces of music with matching chord structure.
Finally, music similarity analysis, e.g., for plagiarism retrieval, can be based on chord information. As an example, the chord progression of Johann Pachelbel's "Canon in D" ("Canon per 3 Violini e Basso") is found in multiple contemporary popular songs, such as "Go West", first by the Village People and later covered by the Pet Shop Boys, Ralph McTell's "Streets of London", The Farm's "All Together Now", Green Day's "Basket Case", Mattafix's "Big City Life", or Juanes's "Volverte a Ver".
To avoid laborious manual labelling, an automatic beat-synchronous and data-driven approach is introduced here. The approach is based on the findings for tempo determination and key determination described in the previous sections. Early automatic chord recognition was based on pitch class profiles [100] (cf. Sect. 6.2.21). Later, HMMs proved highly suited, e.g., in [104, 117]. Obviously, context modelling can improve the recognition rate [132], as chords tend to follow chords with certain properties such as neighbourhood in the circle of fifths (cf. Sect. 11.4). Exploiting these foundations, results on realistic data are shown, including a progression LM trained on a large corpus of 16k songs, to demonstrate the results achievable on a database of mixed original recordings.
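To make these ingredients concrete, the following is a minimal sketch of template-based chord recognition over beat-synchronous pitch class profiles (chroma vectors), combined with Viterbi decoding over a chord transition matrix. It is a simplified stand-in for the HMM and language-model approaches cited above, not the implementation used here; the function names, the restriction to the 24 major and minor triads, and the transitions argument (log transition probabilities, e.g., estimated as a bigram model from a chord-annotated corpus) are assumptions made for the example.

```python
import numpy as np

N_CHORDS = 24  # 12 major and 12 minor triads


def triad_templates():
    """Binary chroma templates for the C..B major and minor triads, L2-normalised."""
    templates = np.zeros((N_CHORDS, 12))
    for root in range(12):
        templates[root, [root, (root + 4) % 12, (root + 7) % 12]] = 1.0       # major
        templates[12 + root, [root, (root + 3) % 12, (root + 7) % 12]] = 1.0  # minor
    return templates / np.linalg.norm(templates, axis=1, keepdims=True)


def decode_chords(chroma, transitions):
    """Viterbi decoding of a chord sequence.

    chroma:      (n_beats, 12) beat-synchronous pitch class profiles
    transitions: (24, 24) log probabilities of chord-to-chord transitions
    returns:     list of chord indices (0-11 major, 12-23 minor), one per beat
    """
    templates = triad_templates()
    # Emission scores: log cosine similarity between each beat's chroma and each template
    norm = np.linalg.norm(chroma, axis=1, keepdims=True) + 1e-9
    emissions = np.log((chroma / norm) @ templates.T + 1e-9)

    n_beats = chroma.shape[0]
    score = np.zeros((n_beats, N_CHORDS))
    back = np.zeros((n_beats, N_CHORDS), dtype=int)
    score[0] = emissions[0]                          # uniform chord prior assumed
    for t in range(1, n_beats):
        total = score[t - 1][:, None] + transitions  # previous chord x current chord
        back[t] = np.argmax(total, axis=0)           # best predecessor per current chord
        score[t] = total[back[t], np.arange(N_CHORDS)] + emissions[t]

    # Trace back the best chord sequence
    path = [int(np.argmax(score[-1]))]
    for t in range(n_beats - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

With a uniform transition matrix the decoding reduces to independent per-beat template matching; a bigram matrix trained on a large chord-annotated corpus, as in the progression LM mentioned above, instead rewards musically plausible successions, e.g., between neighbours in the circle of fifths.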
11.5.1 ChoRD Database
The Chord Recognition Database, or ChoRD database, was introduced in [29].
 