Abstract
This paper reports on a study assessing the feasibility of an intuitive environmental sound monitoring system that can be used on-location and return meaningful measurements beyond the standard LAeq. To test this, an iOS app was created using Machine Learning (ML) and Augmented Reality (AR) in conjunction with the Sennheiser AMBEO Smart Headset. The app returns readings indicating the human, natural and mechanical sound content of the local acoustic scene, and implements four virtual sound objects which the user can place in the scene to observe their effect on the readings. Testing at various types of urban location indicates that the app returns meaningful ratings for natural and mechanical sound, though the pattern of variation in the ratings for human sound is less clear. Adding the virtual objects has no significant effect on the ratings, apart from the car object, which significantly increases the mechanical rating. The results indicate that using ML to provide meaningful on-location sound monitoring is feasible, though the performance of the app developed here could be improved with additional calibration.
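As a point of reference for the "beyond LAeq" claim, the baseline LAeq metric is simply the equivalent continuous A-weighted sound pressure level. A minimal sketch of that calculation, assuming the input is already calibrated, A-weighted pressure samples in pascals (the function name `laeq` is illustrative, not from the paper):

```python
import math

P0 = 20e-6  # reference sound pressure, 20 µPa

def laeq(pressures_pa):
    """Equivalent continuous A-weighted level over the sample block.

    `pressures_pa` is assumed to hold A-weighted sound pressure samples
    in pascals; A-weighting itself is not applied here.
    """
    mean_square = sum(p * p for p in pressures_pa) / len(pressures_pa)
    return 10.0 * math.log10(mean_square / (P0 * P0))

# A constant 1 Pa RMS signal corresponds to 94 dB SPL:
print(round(laeq([1.0, -1.0, 1.0, -1.0]), 1))  # → 94.0
```

A single LAeq figure says nothing about *what* is sounding, which is the gap the paper's ML-derived human/natural/mechanical ratings are intended to fill.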
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-8 |
| Number of pages | 8 |
| Journal | Applied Acoustics |
| Volume | 159 |
| DOIs | |
| Publication status | Published - 23 Oct 2019 |
Bibliographical note
© 2019 The Authors. Published by Elsevier Ltd.
Keywords
- acoustic
- augmented reality
- machine learning
- environmental acoustics
- soundscape
Projects
- 1 Finished
- XR stories
Murphy, D. T. (Principal investigator), Higson, A. D. (Co-investigator) & Ursu, M. (Co-investigator)
8/10/18 → 31/03/24
Project: Research project (funded) › Research