From Acceleration to Rhythmicity: Smartphone-Assessed Movement Predicts Properties of Music

Melanie Irrgang, Jochen Steffens, Hauke Egermann

Research output: Contribution to journal › Article › peer-review

Abstract

Music moves us. Yet, querying music is still a disembodied process in most music recommender scenarios. New mediation technologies like querying music by movement would take account of the empirically well-founded knowledge of embodied music cognition. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, and regularity were extracted from the accelerometer data to predict the musical qualities “rhythmicity”, “pitch level + range” and “complexity” assessed by three music experts. Motion features selected by a stepwise AIC model predicted the musical properties to the following degrees: “rhythmicity” (R² = .45), “pitch level and range” (R² = .06), and “complexity” (R² = .15). We conclude that (rhythmic) music properties can be predicted from the movement the music evoked, and that an embodied approach to Music Information Retrieval is feasible.
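As a rough illustration of the kind of analysis summarised in the abstract (not the authors' actual pipeline), the sketch below performs forward stepwise OLS selection by AIC on hypothetical motion features (tempo, smoothness, size, regularity) to predict a single expert-rated musical property. All feature values, the rating, and the helper name forward_aic are placeholders introduced here for illustration.

# Illustrative sketch only: forward stepwise OLS selection by AIC,
# loosely analogous to the feature selection described in the abstract.
# Feature values and the rating are simulated placeholders, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100  # hypothetical number of (participant x stimulus) observations

# Hypothetical motion features extracted from accelerometer data
X = pd.DataFrame({
    "tempo": rng.normal(size=n),
    "smoothness": rng.normal(size=n),
    "size": rng.normal(size=n),
    "regularity": rng.normal(size=n),
})
# Hypothetical expert rating of one musical property (e.g. "rhythmicity")
y = 0.8 * X["tempo"] + 0.5 * X["regularity"] + rng.normal(scale=0.5, size=n)

def forward_aic(X, y):
    """Greedily add the predictor that lowers the model's AIC the most."""
    remaining, selected = list(X.columns), []
    best_aic = sm.OLS(y, np.ones(len(y))).fit().aic  # intercept-only baseline
    improved = True
    while improved and remaining:
        improved = False
        aics = {f: sm.OLS(y, sm.add_constant(X[selected + [f]])).fit().aic
                for f in remaining}
        feat, aic = min(aics.items(), key=lambda kv: kv[1])
        if aic < best_aic:
            best_aic, improved = aic, True
            selected.append(feat)
            remaining.remove(feat)
    return selected, sm.OLS(y, sm.add_constant(X[selected])).fit()

features, model = forward_aic(X, y)
print("selected features:", features, "| R^2 =", round(model.rsquared, 2))

Here statsmodels stands in for an R-style stepAIC procedure; the greedy loop stops as soon as no remaining feature lowers the AIC, and the R² of the final model is reported, mirroring how the abstract summarises predictive fit per musical property.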
Original language: English
Pages (from-to): 1-15
Number of pages: 15
Journal: Journal of New Music Research
Early online date: 30 Jan 2020
DOIs
Publication status: E-pub ahead of print - 30 Jan 2020

Bibliographical note

© 2020 Informa UK Limited, trading as Taylor & Francis Group. This is an author-produced version of the published paper. Uploaded in accordance with the publisher’s self-archiving policy. Further copying may not be permitted; contact the publisher for details.
