Real-time dynamic image-source implementation for auralisation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

  • André Oliveira
  • Guilherme Campos
  • Paulo Dias
  • Damian Thomas Murphy
  • José Vieira
  • Catarina Mendonça
  • Jorge Santos

Publication details

Title of host publication: Proceedings of the 16th International Conference on Digital Audio Effects
Date: Published - 2 Sep 2013
Pages: 368-372
Number of pages: 5
Place of publication: Maynooth
Original language: English

Abstract

This paper describes a software package for auralisation in interactive virtual reality environments. Its purpose is to reproduce, in real time, the 3D soundfield within a virtual room where listener and sound sources can be moved freely. Output sound is presented binaurally using headphones. Auralisation is based on geometric acoustic models combined with head-related transfer functions (HRTFs): the direct sound and reflections from each source are computed dynamically by the image-source method. Directional cues are obtained by filtering these incoming sounds by the HRTFs corresponding to their propagation directions relative to the listener, computed on the basis of the information provided by a head-tracking device. Two interactive real-time applications were developed to demonstrate the operation of this software package. Both provide a visual representation of listener (position and head orientation) and sources (including image sources). One focusses on the auralisation-visualisation synchrony and the other on the dynamic calculation of reflection paths. Computational performance results of the auralisation system are presented.
