ARCH-COMP23 Repeatability Evaluation Report

7 pages • Published: October 18, 2023

Abstract

This report summarizes the repeatability evaluation for the 7th International Competition on Verifying Continuous and Hybrid Systems (ARCH-COMP'23). The competition took place as part of the workshop Applied Verification for Continuous and Hybrid Systems (ARCH) in 2023, affiliated with the 2023 Cyber-Physical Systems and Internet-of-Things Week (CPS-IoT Week). In this seventh edition, participants submitted tool artifacts through a new automated evaluation system; the artifacts were synchronized with a Git repository for repeatability evaluation and archiving, and were used to solve benchmark instances across the different competition categories. Due to procedural changes in execution through the automated system, fewer participants took part in the repeatability evaluation this year than in past iterations. The process generally consisted of submitting scripts that automatically install and execute the tools in containerized virtual environments, specifically Dockerfiles to build and run Docker containers, together with execution scripts. With the automated evaluation system, most participating categories reported performance evaluation information obtained on this common execution platform.

Keyphrases: artifact evaluation, formal methods, hybrid systems, repeatability evaluation, verification

In: Goran Frehse and Matthias Althoff (editors). Proceedings of the 10th International Workshop on Applied Verification of Continuous and Hybrid Systems (ARCH23), vol. 96, pages 189-195.
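As an illustration of the kind of artifact described in the abstract, the sketch below shows a minimal Dockerfile for installing a tool and running its benchmark instances inside a Docker container. The base image, dependency list, and the names requirements.txt, measure_all.sh, and archcomp-tool are hypothetical assumptions for illustration only; they are not the actual competition artifacts or the evaluation system's interface.

    # Hypothetical artifact Dockerfile: installs a tool and runs its benchmarks.
    FROM ubuntu:22.04

    # Install system dependencies needed to build and run the (hypothetical) tool.
    RUN apt-get update && apt-get install -y --no-install-recommends \
            python3 python3-pip git && \
        rm -rf /var/lib/apt/lists/*

    # Copy the tool sources and benchmark configurations into the image.
    COPY . /artifact
    WORKDIR /artifact

    # Install the tool's Python dependencies (assumed requirements.txt).
    RUN pip3 install -r requirements.txt

    # The execution script runs all benchmark instances and writes result files
    # that an evaluation system could collect.
    ENTRYPOINT ["/bin/bash", "measure_all.sh"]

A typical invocation of such an artifact would build the image and run it with the results directory mounted on the host, for example:

    docker build -t archcomp-tool .
    docker run --rm -v "$(pwd)/results:/artifact/results" archcomp-tool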