Abstract
In stereo sound systems, the listener's ability to localise sound in the reproduced stereo field is affected by a variety of factors, such as their listening position and the degree of separation between the loudspeakers. Understanding the effect of these factors is important for creating object-based mixes, where the panning position and volume of sounds can be adjusted to suit the user at the point of media consumption. Being able to control the panning and volume of different sources in the renderer is especially useful for visually impaired audience members, as it can allow for greater immersion and narrative understanding for film and television. For this reason, it is important to conduct listening tests on how these audiences perceive sound over stereo systems. These listening tests can then be used to inform renderer panning algorithms. However, before inviting visually impaired people to take part in listening tests, a robust methodology and control group comparison must first be established. Here, we present an algorithm driven by the results of a Minimum Audible Angle (MAA) staircase listening test for different combinations of listening position and loudspeaker base width. The test consisted of 22 trials across 21 sighted participants. We calculated the mean MAA for these 22 trials and used the results to determine our panning angle for a given listener position and stereo base width. We then carried out a smaller listening test (8 tests across 8 participants) to determine the effectiveness of the perceptually informed rendering algorithm, calculating the F-score of our method against a control method that does not account for these factors. The mean F-score was 0.62 for our method against a mean of 0.36 for the control. Our results indicate that our algorithm has the potential to be used for object-based mixes where spatial separation for all audience members is more important than precise object placement.
Future research will carry out the same experiment with visually impaired audience members, and compare the results with this pilot study.
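The abstract's comparison rests on the F-score, the harmonic mean of precision and recall. As a minimal sketch of how such per-renderer scores could be computed, the snippet below tallies hypothetical trial outcomes (true positives, false positives, false negatives); the tallies are invented for illustration and are not the study's data.

```python
# Illustrative F-score (F1) computation for comparing two renderers.
# The per-trial tallies below are hypothetical, chosen only to show the
# shape of the calculation; they are not taken from the paper's results.

def f_score(true_pos: int, false_pos: int, false_neg: int) -> float:
    """Harmonic mean of precision and recall (the F1 measure)."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return 2 * precision * recall / (precision + recall)

# Hypothetical outcome counts for a proposed renderer and a control.
proposed = f_score(true_pos=10, false_pos=4, false_neg=8)
control = f_score(true_pos=6, false_pos=9, false_neg=12)
print(f"proposed: {proposed:.2f}, control: {control:.2f}")
```

A higher F-score here indicates that a renderer's object placements are both precise and consistently detected, which is the sense in which the perceptually informed method outperformed the control.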
Original language | English |
---|---|
Title of host publication | 2023 Immersive and 3D Audio: from Architecture to Automotive (I3DA) |
Place of Publication | Bologna, Italy |
Publisher | IEEE |
Number of pages | 6 |
ISBN (Electronic) | 979-8-3503-1104-4 |
ISBN (Print) | 979-8-3503-1105-1 |
DOIs | |
Publication status | Published - 23 Oct 2023 |
Event | Immersive and 3D Audio: from Architecture to Automotive (I3DA), Bologna, Italy. Duration: 5 Sept 2023 → 7 Sept 2023 |
Conference
Conference | Immersive and 3D Audio: from Architecture to Automotive (I3DA) |
---|---|
Country/Territory | Italy |
City | Bologna |
Period | 5/09/23 → 7/09/23 |
Keywords
- Stereo
- VBAP
- Accessibility
- object-based audio
- Film
Projects
- Enhancing Audio Description II: Implementing accessible, personalised and inclusive film and television experiences for visually impaired audiences (Active)
  15/11/21 → 14/11/25
  Project: Research project (funded) › Research