TY - GEN
T1 - Smartphone-Assessed Movement Predicts Music Properties
T2 - Towards Integrating Embodied Music Cognition into Music Recommender Services via Accelerometer
AU - Irrgang, Melanie
AU - Steffens, Jochen
AU - Egermann, Hauke Wolfgang
N1 - © 2018, Author(s). This is an author-produced version of the published paper. Uploaded in accordance with the publisher's self-archiving policy. Further copying may not be permitted; contact the publisher for details.
PY - 2018/6/28
Y1 - 2018/6/28
N2 - Numerous studies have shown a close relationship between movement and music [7], [17], [11], [14], [16], [3], [8]. That is why Leman calls for new mediation technologies to query music in a corporeal way [9]. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, regularity, and direction were extracted from accelerometer data to predict the musical qualities "rhythmicity", "pitch level + range", and "complexity" assessed by three music experts. Motion features selected by a 20-fold lasso predicted the musical properties to the following degrees: "rhythmicity" (R² = .47), "pitch level + range" (R² = .03), and "complexity" (R² = .10). As a consequence, we conclude that music properties can be predicted from the movement evoked by the music, and that an embodied approach to Music Information Retrieval is feasible.
AB - Numerous studies have shown a close relationship between movement and music [7], [17], [11], [14], [16], [3], [8]. That is why Leman calls for new mediation technologies to query music in a corporeal way [9]. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, regularity, and direction were extracted from accelerometer data to predict the musical qualities "rhythmicity", "pitch level + range", and "complexity" assessed by three music experts. Motion features selected by a 20-fold lasso predicted the musical properties to the following degrees: "rhythmicity" (R² = .47), "pitch level + range" (R² = .03), and "complexity" (R² = .10). As a consequence, we conclude that music properties can be predicted from the movement evoked by the music, and that an embodied approach to Music Information Retrieval is feasible.
KW - Accelerometer
KW - Embodied cognition and movement
KW - Music information retrieval
KW - Smartphone
UR - http://www.scopus.com/inward/record.url?scp=85055328003&partnerID=8YFLogxK
U2 - 10.1145/3212721.3212852
DO - 10.1145/3212721.3212852
M3 - Conference contribution
T3 - ACM Proceedings
BT - Proceedings of the 5th International Conference on Movement and Computing, MOCO 2018
PB - ACM
ER -