High-Level Chord Features Extracted From Audio Can Predict Perceived Musical Expression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Publication details

Title of host publication: Extended abstracts for the Late-Breaking Demo Session of the 18th International Society for Music Information Retrieval Conference, Suzhou, China, 2017
Date: Published - 2017
Number of pages: 2
Original language: English

Abstract

We investigated the relationship between high-level chord features and the perceived semantic and emotional expression of musical pieces in the context of music branding. To this end, we first developed high-level chord features based on musicological considerations and novel MIR technologies. Among other things, these features represent the number of chords, the proportion of major/minor chords, and the frequency of certain cadences and turnarounds. The validity of these features for predicting listeners' perceived musical expression beyond genre information was subsequently tested using data from two online listening experiments, in which the musical expression of 549 music titles had been rated on four factors: Easy-going, Joyful, Authentic, and Progressive. Results show that chord features significantly improved prediction results in all four models. The most important features turned out to be those representing the number of (unique) chords and the proportion of minor chords. Implications of the results are discussed, and future work is outlined.
