VerifyThis 2019: a program verification competition

Claire Dross, Carlo A. Furia*, Marieke Huisman, Rosemary Monahan, Peter Müller

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review



VerifyThis is a series of program verification competitions that emphasize the human aspect: participants tackle the verification of detailed behavioral properties—something that lies beyond the capabilities of fully automatic verification and requires instead human expertise to suitably encode programs, specifications, and invariants. This paper describes the 8th edition of VerifyThis, which took place at ETAPS 2019 in Prague. Thirteen teams entered the competition, which consisted of three verification challenges and spanned 2 days of work. This report analyzes how the participating teams fared on these challenges, reflects on what makes a verification challenge more or less suitable for the typical VerifyThis participants, and outlines the difficulties of comparing the work of teams using wildly different verification approaches in a competition focused on the human aspect.

Original language: English
Pages (from-to): 883-893
Number of pages: 11
Journal: International Journal on Software Tools for Technology Transfer
Issue number: 6
Early online date: 19 May 2021
Publication status: Published - Dec 2021


Keywords:
  • Correctness proof
  • Functional correctness
  • Program verification
  • Verification competition


