Abstract
Autonomous drones have been proposed for many industrial inspection roles, including wind farms, railway lines and solar farms. They offer many potential benefits, including access to difficult-to-reach locations, reduced physical risk to operators and improved performance. The drones must be assured to be both safe and dependable, particularly if they are used near high-value infrastructure. One key aspect is assurance of any Machine Learning (ML) components, which can be extremely challenging due to informal requirements and the difficulty of verifying performance over a wide enough range of conditions. Hence, software-based simulations are often used to supplement real-world testing, providing coverage of hard-to-test situations (e.g., extreme wind conditions or bird strike) and many variations in the external environment. However, simulations are approximate models of reality, and it can be difficult to anticipate what must be modelled (and to what precision) to be confident of a sufficiently accurate prediction of real-world performance.
In this paper we present an approach to assuring the simulation environment and models used for both training and verifying the performance of an ML Flight Controller (FC) in an autonomous drone designed for offshore wind-farm inspection. Our approach considers specific aspects of modelling the drone hardware and target Operational Domain Model (ODM) as well as quality assurance of the simulation tool. We describe how real-world testing was used to improve simulation models, provide assurance in the quality of the simulation tool, and thus support a safety assurance case.
Original language | English
---|---
Publication status | Published - 2024
Event | Seventh International Workshop on Artificial Intelligence Safety Engineering, Florence, Italy. Duration: 17 Sept 2024 → …
Workshop

Workshop | Seventh International Workshop on Artificial Intelligence Safety Engineering
---|---
Abbreviated title | WAISE
Country/Territory | Italy
City | Florence
Period | 17/09/24 → …