A Perceptual and Affective Evaluation of an Affectively Driven Engine for Video Game Soundtracking

Research output: Contribution to journal › Article › peer-review

Abstract

We report on a player evaluation of a pilot system for dynamic video game soundtrack generation. The system under evaluation generates music using an AI-based algorithmic composition technique to create a score in real time, responding to a continuously varying emotional trajectory dictated by gameplay cues. After a section of gameplay, players rated the system on a Likert scale for emotional congruence with the narrative, and for their perceived immersion in the gameplay. The generative system showed a statistically significant and consistent improvement in ratings for emotional congruence, but a decrease in perceived immersion, which might be attributed to the marked difference in instrumentation between the generated music, voiced by a solo piano timbre, and the original, fully orchestrated soundtrack. Finally, players rated selected stimuli from the generated soundtrack dataset on a two-dimensional model of perceived valence and arousal. These ratings were compared with the intended emotional descriptor in the metadata accompanying specific gameplay events. Participant responses suggested strong agreement with the affective correlates, but also a significant amount of inter-participant variability. Individual calibration, or further adjustment, of the musical feature set is therefore suggested as a useful avenue for further work.
Original language: English
Number of pages: 19
Journal: ACM Computers in Entertainment
Publication status: Published - Dec 2016