Environmental sound monitoring using machine learning on mobile devices

Marc Ciufo Green, Damian Thomas Murphy

Research output: Contribution to journal › Article › peer-review


This paper reports on a study to assess the feasibility of creating an intuitive environmental sound monitoring system that can be used on-location and return meaningful measurements beyond the standard LAeq. To test this, an iOS app was created using Machine Learning (ML) and Augmented Reality (AR) in conjunction with the Sennheiser AMBEO Smart Headset. The app returns readings indicating the human, natural and mechanical sound content of the local acoustic scene, and implements four virtual sound objects which the user can place in the scene to observe their effect on the readings. Testing at various types of urban location indicates that the app returns meaningful ratings for natural and mechanical sound, though the pattern of variation in the ratings for human sound is less clear. Adding the virtual objects has no significant effect on the readings, with the exception of the car object, which significantly increases mechanical ratings. Results indicate that using ML to provide meaningful on-location sound monitoring is feasible, though the performance of the app developed could be improved with additional calibration.
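The LAeq mentioned above is the A-weighted equivalent continuous sound level, the standard single-number summary the app aims to go beyond. It is defined as LAeq = 10·log10((1/T)·∫(pA(t)/p0)² dt), with reference pressure p0 = 20 µPa. The sketch below is not from the paper; it is a minimal illustration of that formula for a digital signal, and it assumes the input array has already been A-weighted and calibrated to pascals.

```python
import numpy as np

def laeq(pressure_pa: np.ndarray, p_ref: float = 20e-6) -> float:
    """Equivalent continuous sound level in dB over the whole signal.

    Assumes `pressure_pa` is already A-weighted and calibrated to pascals;
    p_ref is the standard 20 micropascal reference pressure.
    """
    mean_square = np.mean(np.square(pressure_pa.astype(float)))
    return 10.0 * np.log10(mean_square / p_ref**2)

# Sanity check: a steady tone at 1 Pa RMS corresponds to ~94 dB.
fs = 48_000
t = np.arange(fs) / fs
tone = np.sqrt(2) * np.sin(2 * np.pi * 1000 * t)  # 1 Pa RMS, 1 kHz
print(f"{laeq(tone):.1f} dB")  # ≈ 94.0 dB
```

Note that averaging the squared pressure before taking the logarithm (rather than averaging decibel values) is what makes LAeq an energy-equivalent level.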
Original language: English
Pages (from-to): 1-8
Number of pages: 8
Journal: Applied Acoustics
Publication status: Published - 23 Oct 2019

Bibliographical note

© 2019 The Authors. Published by Elsevier Ltd.


Keywords

  • acoustics
  • augmented reality
  • machine learning
  • environmental acoustics
  • soundscape
