TY - JOUR
T1 - From Acceleration to Rhythmicity
T2 - Smartphone-Assessed Movement Predicts Properties of Music
AU - Irrgang, Melanie
AU - Steffens, Jochen
AU - Egermann, Hauke
N1 - © 2020 Informa UK Limited, trading as Taylor & Francis Group. This is an author-produced version of the published paper. Uploaded in accordance with the publisher’s self-archiving policy. Further copying may not be permitted; contact the publisher for details.
PY - 2020/1/30
Y1 - 2020/1/30
AB - Music moves us. Yet, querying music is still a disembodied process in most music recommender scenarios. New mediation technologies like querying music by movement would take account of the empirically well-founded knowledge of embodied music cognition. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20s length presented in random order. Motion features related to tempo, smoothness, size, and regularity were extracted from accelerometer data to predict the musical qualities “rhythmicity”, “pitch level + range” and “complexity” assessed by three music experts. Motion features selected by a stepwise AIC model predicted the musical properties to the following degrees: “rhythmicity” (R2 = .45), “pitch level and range” (R2 = .06) and “complexity” (R2 = .15). We conclude that (rhythmic) music properties can be predicted from the movement they evoked, and that an embodied approach to Music Information Retrieval is feasible.
U2 - 10.1080/09298215.2020.1715447
DO - 10.1080/09298215.2020.1715447
M3 - Article
SN - 0929-8215
SP - 1
EP - 15
JO - Journal of New Music Research
JF - Journal of New Music Research
ER -