Bayesian Learning for the Robust Verification of Autonomous Robots

Xingyu Zhao*, Simos Gerasimou*, Radu Calinescu, Calum Corrie Imrie, Valentin Robu, David Flynn

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Autonomous robots used in infrastructure inspection, space exploration and other critical missions operate in highly dynamic environments. As such, they must continually verify their ability to complete the tasks associated with these missions safely and effectively. Here we present a Bayesian learning framework that enables this runtime verification of autonomous robots. The framework uses prior knowledge and observations of the verified robot to learn expected ranges for the occurrence rates of regular and singular (e.g., catastrophic failure) events. Interval continuous-time Markov models defined using these ranges are then analysed to obtain expected intervals of variation for system properties such as mission duration and success probability. We apply the framework to an autonomous robotic mission for underwater infrastructure inspection and repair. The formal proofs and experiments presented in the paper show that our framework produces results that reflect the uncertainty intrinsic to many real-world systems, enabling the robust verification of their quantitative properties under parametric uncertainty.
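To illustrate the kind of Bayesian rate learning the abstract describes, the sketch below estimates a credible interval for an event occurrence rate using a Gamma-Poisson conjugate model. This is a minimal illustration under assumed hyperparameters and hypothetical observation counts, not the paper's actual framework; the resulting interval bounds are the sort of quantities that would parameterise an interval continuous-time Markov model.

```python
from scipy.stats import gamma

# Hypothetical observations: k occurrences of an event over t hours of operation
k, t = 3, 1000.0

# Illustrative Gamma(alpha0, rate=beta0) prior on the occurrence rate lambda,
# encoding prior knowledge about how often the event happens
alpha0, beta0 = 0.5, 100.0

# Conjugate update: with a Poisson likelihood, the posterior is again a Gamma
alpha_post = alpha0 + k
beta_post = beta0 + t

# 95% posterior credible interval for lambda; these bounds could serve as
# the lower/upper transition rates of an interval CTMC
lam_lo = gamma.ppf(0.025, a=alpha_post, scale=1.0 / beta_post)
lam_hi = gamma.ppf(0.975, a=alpha_post, scale=1.0 / beta_post)
print(f"rate interval: [{lam_lo:.5f}, {lam_hi:.5f}] events/hour")
```

As more operational data accrue (larger k and t), the posterior concentrates and the interval narrows, which matches the abstract's emphasis on continually re-verifying the robot at runtime.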
Original language: English
Journal: Communications Engineering
Publication status: Accepted/In press - 22 Nov 2023
