Abstract
The rapid advancement of technologies enabling autonomous ship capabilities has outpaced the development of corresponding legislative and regulatory frameworks, creating a bottleneck in the worldwide deployment of autonomous ships as classified by the four MASS (Maritime Autonomous Surface Ship) degrees of autonomy.
A significant issue is the absence of a well-defined process for certifying new algorithms and systems before they are installed on board. Recently, a reliable, open-access, structured set of test scenarios, including several challenging COLREG encounter situations, has been published to benchmark the many path-planning algorithms developed over the years.
When connected to this testing framework, the collision avoidance modules under test must accomplish two primary tasks: determining the applicable COLREG rules (COLREG classification) and computing an evasive route.
This paper focuses on defining an approach to evaluate the performance of collision avoidance algorithms through dedicated metrics.
These metrics are formulated to quantitatively compare evasive manoeuvres according to relevant performance indices and to support a human-based evaluation of COLREG compliance.
The operation of the comparison metrics is demonstrated by testing an existing collision avoidance algorithm developed by the authors.
This demonstration underscores the effectiveness of the analysed algorithm in efficiently handling the challenging scenarios proposed in the literature.
Finally, the paper offers suggestions for modifying and improving the test scenarios to enhance the robustness of the comparison metrics.
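To illustrate the kind of quantitative indices such comparison metrics could rely on, the following minimal sketch (hypothetical, not taken from the paper) evaluates one evasive trajectory against a nominal route and a target-ship track using two assumed indices, path elongation and minimum separation:

```python
import math
from dataclasses import dataclass


@dataclass
class TrajectoryPoint:
    t: float  # time [s]
    x: float  # east position [m]
    y: float  # north position [m]


def path_length(traj):
    """Total distance sailed along the manoeuvre."""
    return sum(math.hypot(b.x - a.x, b.y - a.y)
               for a, b in zip(traj, traj[1:]))


def min_separation(own, target):
    """Minimum own-ship/target distance over time-aligned samples,
    a proxy for the distance at the closest point of approach."""
    return min(math.hypot(o.x - t.x, o.y - t.y)
               for o, t in zip(own, target))


def compare_manoeuvre(own, target, nominal):
    """Illustrative performance indices for one encounter:
    extra distance sailed relative to the nominal (no-avoidance)
    route and the minimum separation kept from the target ship."""
    return {
        "path_elongation": path_length(own) / path_length(nominal),
        "min_separation_m": min_separation(own, target),
    }
```

The index names and the trajectory representation above are assumptions made for illustration only; the paper's actual metrics and data structures may differ.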