Real-time dynamic image-source implementation for auralisation

André Oliveira, Guilherme Campos, Paulo Dias, Damian Thomas Murphy, José Vieira, Catarina Mendonça, Jorge Santos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


This paper describes a software package for auralisation in interactive virtual reality environments. Its purpose is to reproduce, in real time, the 3D sound field within a virtual room in which the listener and sound sources can be moved freely. Output sound is presented binaurally over headphones. Auralisation is based on geometric acoustic models combined with head-related transfer functions (HRTFs): the direct sound and the reflections from each source are computed dynamically by the image-source method. Directional cues are obtained by filtering these incoming sounds with the HRTFs corresponding to their propagation directions relative to the listener, computed from the information provided by a head-tracking device. Two interactive real-time applications were developed to demonstrate the operation of this software package. Both provide a visual representation of the listener (position and head orientation) and the sources (including image sources). One focusses on auralisation-visualisation synchrony, the other on the dynamic calculation of reflection paths. Computational performance results of the auralisation system are presented.
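The core geometric step of the image-source method mentioned in the abstract is mirroring each source across the room's wall planes; the reflection then behaves like direct sound radiated from the mirrored ("image") position. A minimal sketch of that step, assuming a planar wall described by a point and a unit normal (function and parameter names are illustrative, not taken from the paper's software):

```python
import math

def image_source(source, plane_point, plane_normal):
    """Mirror a source position across a wall plane (first-order image source).

    Implements s' = s - 2 * ((s - p) . n) * n, where p is a point on the
    wall plane and n its (normalised) normal vector.
    """
    nx, ny, nz = plane_normal
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    sx, sy, sz = source
    px, py, pz = plane_point
    # Signed distance from the source to the wall plane
    d = (sx - px) * nx + (sy - py) * ny + (sz - pz) * nz
    return (sx - 2 * d * nx, sy - 2 * d * ny, sz - 2 * d * nz)

def reflection_delay(source, listener, plane_point, plane_normal, c=343.0):
    """Propagation delay (seconds) of the first-order wall reflection:
    distance from the image source to the listener over the speed of sound."""
    img = image_source(source, plane_point, plane_normal)
    r = math.dist(img, listener)
    return r / c
```

Higher-order reflections are obtained by mirroring image sources recursively across the other walls; in a dynamic scene, as described in the paper, these positions and delays must be recomputed whenever the source or listener moves.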
Original language: English
Title of host publication: Proceedings of the 16th International Conference on Digital Audio Effects
Place of publication: Maynooth
Number of pages: 5
Publication status: Published - 2 Sept 2013
Event: 16th International Conference on Digital Audio Effects - Maynooth, Ireland
Duration: 2 Sept 2013 - 5 Sept 2013


Conference: 16th International Conference on Digital Audio Effects
Abbreviated title: DAFx13