Emotion-recognition using smart watch accelerometer data: preliminary findings

Juan C. Quiroz, Min Hooi Yong, Elena Geangu

Research output: Contribution to conference › Paper › peer-review

Abstract

This study investigates the use of accelerometer data from a smart watch to infer an individual's emotional state. We present our preliminary findings from a user study with 50 participants. Participants were primed with either audio-visual stimuli (movie clips) or audio stimuli (classical music) to elicit emotional responses. Participants then walked while wearing a smart watch on one wrist and a heart rate strap on their chest. Our hypothesis is that the accelerometer signal will exhibit different patterns in response to different emotion priming. We divided the accelerometer data using sliding windows, extracted features from each window, and used the features to train supervised machine learning algorithms to infer an individual's emotion from their walking pattern. Our discussion includes a description of the methodology, the data collected, and early results.
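The sliding-window feature-extraction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, step size, and feature set (per-window mean, standard deviation, min, max) are assumptions, since the paper does not specify them here.

```python
import numpy as np

def window_features(signal, win_len, step):
    """Slide a fixed-length window over a 1-D accelerometer signal and
    extract simple statistical features from each window.

    win_len and step are illustrative choices, not values from the paper.
    Returns an array of shape (n_windows, 4) suitable as input to a
    supervised classifier.
    """
    feats = []
    for start in range(0, len(signal) - win_len + 1, step):
        w = signal[start:start + win_len]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Synthetic stand-in for 10 s of 50 Hz accelerometer magnitude data
rng = np.random.default_rng(0)
signal = rng.normal(size=500)

# 2 s windows (100 samples) with 50% overlap (step of 50 samples)
X = window_features(signal, win_len=100, step=50)
print(X.shape)  # → (9, 4)
```

Each row of `X` would then be paired with the emotion-priming condition as its label and fed to a supervised learning algorithm, as the abstract describes.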
Original language: English
Publication status: Published - 2017
Event: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers - Maui, United States
Duration: 11 Sept 2017 – 15 Oct 2017

Conference

Conference: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers
Country/Territory: United States
Period: 11/09/17 – 15/10/17