Environmental sound monitoring using machine learning on mobile devices

Research output: Contribution to journal › Article › peer-review


Publication details

Journal: Applied Acoustics
Date accepted/In press: 15 Sep 2019
Date published: 23 Oct 2019
Number of pages: 8
Pages (from-to): 1-8
Original language: English


This paper reports on a study to assess the feasibility of creating an intuitive environmental sound monitoring system that can be used on-location and return meaningful measurements beyond the standard LAeq. To test this, an iOS app was created using Machine Learning (ML) and Augmented Reality (AR) in conjunction with the Sennheiser AMBEO Smart Headset. The app returns readings indicating the human, natural and mechanical sound content of the local acoustic scene, and implements four virtual sound objects which the user can place in the scene to observe their effect on the readings. Testing at various types of urban locations indicates that the app returns meaningful ratings for natural and mechanical sound, though the pattern of variation in the ratings for human sound is less clear. Adding the virtual objects has no significant effect on the ratings, apart from the car object, which significantly increases mechanical ratings. Results indicate that using ML to provide meaningful on-location sound monitoring is feasible, though the performance of the developed app could be improved with additional calibration.
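For context on the baseline measurement the study aims to go beyond: LAeq is the equivalent continuous A-weighted sound level, i.e. the level of a steady sound carrying the same energy as the measured, time-varying signal. A minimal sketch of that calculation, assuming the input is already calibrated, A-weighted pressure samples in pascals (the A-weighting filter itself is omitted for brevity):

```python
import math

# Standard reference pressure for sound level in air: 20 micropascals.
P_REF = 20e-6

def laeq(pressure_samples):
    """Equivalent continuous sound level in dB from calibrated,
    A-weighted pressure samples (pascals): 10*log10(mean(p^2)/p_ref^2)."""
    mean_square = sum(p * p for p in pressure_samples) / len(pressure_samples)
    return 10.0 * math.log10(mean_square / (P_REF ** 2))

# A steady 1 Pa RMS signal corresponds to roughly 94 dB SPL.
print(round(laeq([1.0] * 1000), 2))  # -> 93.98
```

The app described above supplements this single energy-based figure with per-category (human/natural/mechanical) ratings; the paper does not specify the classifier's aggregation method, so none is sketched here.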

Bibliographical note

© 2019 The Authors. Published by Elsevier Ltd.

Research areas

  • Acoustic, augmented reality, machine learning, environmental acoustics, soundscape


  • XR stories

    Project: Research project (funded)
