Abstract
Introduction
Several studies have shown a close relationship between movement and music (cf. Grahn and Rowe, 2009; Zatorre, Chen and Penhune, 2007) and identified the body as an active contributor to meaning formation (Maes, Leman, Palmer and Wanderley, 2014). Leman (2007) therefore calls for new non-verbal, embodied ways to describe music and its experience, as well as for technologies to query music in a corporeal way.
Research Question
The goal of the presented study was to explore how, and to what extent, movements captured via mobile-device-generated motion sensor data can be related to musical qualities. The outcome indicates how music recommendations could prospectively be generated from smartphone-assessed movement.
Method
Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 female, 10 male) were tracked by an eight-camera OptiTrack motion capture system while moving a smartphone to 15 stimuli of 20 s length, presented in random order. After each piece, participants were interviewed about their movements and the music. A video camera additionally recorded the participants, which supported cleaning the motion capture data and documented the interviews. The motion capture data served to assess the scope of the smartphone motion sensor data, which represents a spatial fusion of movements from several body parts such as the legs and arms.
Several time-compressed motion features related to tempo, smoothness, size, regularity, and direction were extracted from the smartphone motion data and correlated with musical qualities such as “rhythmicity”, “accentuation/articulation”, or “complexity”, as rated by three music experts.
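As an illustration of this pipeline, the following Python sketch shows how such time-compressed features could be derived from a smartphone acceleration signal and correlated with expert ratings across stimuli. The feature definitions (mean magnitude for tempo, mean jerk for smoothness, signal range for size, autocorrelation peak for regularity) are plausible stand-ins, not the study's exact feature set, and the data shown are placeholders.

```python
# Minimal sketch (not the study's exact features): compress a smartphone
# acceleration trace into scalar movement features and correlate them with
# expert ratings of a musical quality across stimuli.
import numpy as np
from scipy.stats import spearmanr

def movement_features(acc, fs=100.0):
    """Compress a (n_samples, 3) acceleration trace into scalar features.

    Illustrative proxies: tempo ~ mean magnitude, smoothness ~ mean jerk
    (inverted), size ~ signal range, regularity ~ autocorrelation peak.
    """
    mag = np.linalg.norm(acc, axis=1)
    jerk = np.diff(mag) * fs                    # rate of change of acceleration
    centered = mag - mag.mean()
    ac = np.correlate(centered, centered, "full")[len(mag) - 1:]
    ac /= ac[0] if ac[0] else 1.0               # normalized autocorrelation
    return {
        "tempo": mag.mean(),
        "smoothness": -np.abs(jerk).mean(),     # higher = smoother
        "size": mag.max() - mag.min(),
        "regularity": ac[1:].max(),             # strength of periodicity
    }

# One feature vector per 20 s stimulus, plus mean expert ratings of a musical
# quality such as "rhythmicity" per stimulus (both placeholder data).
stimuli_acc = [np.random.randn(2000, 3) for _ in range(15)]
rhythmicity = np.random.uniform(1, 7, size=15)

for name in ("tempo", "smoothness", "size", "regularity"):
    values = [movement_features(a)[name] for a in stimuli_acc]
    rho, p = spearmanr(values, rhythmicity)
    print(f"{name}: rho={rho:.2f}, p={p:.3f}")
```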
Results
Results revealed significant correlations between movement features and musical qualities. For example, participants applied swaying movement patterns when the backbeat was prominent, and synchronized their movements with the articulation of the music (smooth, long movements to legato passages in contrast to short, accentuated movements to staccato passages).
First observations also showed that participants with little dancing experience mostly moved to the rhythm of the music, whereas participants with more dancing experience responded in particular to complex melodies. In future work, we therefore plan to assess how inter-individual differences moderate movement behavior.
Conclusion
The presented study examined how movement evoked by music relates to different musical qualities. Although smartphone-assessed hand movement is not as differentiated as data from a full-body motion capture setup, it is a feasible and promising approach for integrating embodied music cognition into music recommender systems.
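As a sketch of how such an integration could look (our assumption; the study does not specify a recommendation mechanism), movement features extracted from the phone could be mapped to a desired profile of musical qualities and matched against annotated tracks, e.g. via nearest-neighbor search. All names and values below are hypothetical.

```python
# Hypothetical sketch: recommend tracks whose annotated musical qualities
# best match a profile derived from a listener's movement. The mapping from
# movement features to desired qualities is assumed, not taken from the study.
import numpy as np

# Placeholder catalog: each track annotated with z-scored qualities
# (rhythmicity, articulation, complexity).
catalog = {
    "track_a": np.array([1.2, -0.3, 0.1]),
    "track_b": np.array([-0.5, 1.0, 0.8]),
    "track_c": np.array([0.4, 0.2, -1.1]),
}

def recommend(quality_profile, catalog, k=1):
    """Return the k tracks whose annotations lie closest to the profile."""
    ranked = sorted(catalog,
                    key=lambda t: np.linalg.norm(catalog[t] - quality_profile))
    return ranked[:k]

# E.g., regular, accentuated movement mapped to high rhythmicity/articulation.
print(recommend(np.array([1.0, 0.8, 0.0]), catalog))
```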
References
Grahn, J. A., and Rowe, J. B. Feeling the beat: premotor and striatal interactions in musicians and non-musicians during beat perception. Journal of Neuroscience 29 (2009), 7540-7548.
Leman, M. Embodied Music Cognition and Mediation Technology. MIT Press, London, 2007.
Maes, P.-J., Leman, M., Palmer, C., and Wanderley, M. Action-based effects on music perception. Frontiers in Psychology 4 (2014).
Zatorre, R. J., Chen, J. L., and Penhune, V. When the brain plays music: auditory-motor interactions in music perception and production. Nature Reviews Neuroscience 8 (2007), 547-558.
Keywords: Embodiment, MIR, MoCap, Smartphone
| Original language | English |
| --- | --- |
| Publication status | Published - 2017 |
| Event | Jahrestagung der Deutschen Gesellschaft für Musikpsychologie |
| Country/Territory | Germany |
| City | Hamburg |
| Period | 15/09/17 → 17/09/17 |
| Internet address | http://musikpsychologie.de/ |