Abstract
In this paper we present two experiments on implementing interaction in sonification displays: the first focuses on recorded data (interactive navigation) and the second on data gathered in real time (auditory feedback).
In the first experiment, complex synthesised data are explored to evaluate how well known characteristics of the data are distinguished under different interaction methods; in the second, real medical data from physiotherapy are used.
Adding interaction to the exploration of sonified recorded data improves system usability (efficiency, effectiveness and user satisfaction), and real-time sonification of complex physiotherapy data can produce sounds whose timbral characteristics audibly change when important features of the data vary.
Original language | English |
---|---|
Pages (from-to) | 923-933 |
Number of pages | 12 |
Journal | International Journal of Human-Computer Studies |
Volume | 67 |
Issue number | 11 |
DOIs | |
Publication status | Published - Nov 2009 |
Keywords
- sonification, interaction, auditory feedback