CEUR Workshop Proceedings
We describe the fourth edition of the CheckThat! Lab, part of the 2022 Conference and Labs of the Evaluation Forum (CLEF). The lab evaluates technology supporting three tasks related to factuality and covers seven languages: Arabic, Bulgarian, Dutch, English, German, Spanish, and Turkish. Here, we present Task 2, which asks to detect previously fact-checked claims (in two languages). A total of six teams participated in this task, submitting 37 runs in total; most submissions achieved sizable improvements over the baselines using transformer-based models such as BERT and RoBERTa. In this paper, we describe the data collection process and the task setup, including the evaluation measures, and we give a brief overview of the participating systems. Last but not least, we release to the research community all datasets from the lab as well as the evaluation scripts, which should enable further research in detecting previously fact-checked claims. © 2022 Copyright for this paper by its authors.
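The task described above is a ranking problem: given an input claim, systems return a ranked list of previously fact-checked ("verified") claims, and the ranking is scored with rank-based evaluation measures. As a hedged illustration only (the names `rank_verified_claims`, `cosine_sim`, and `reciprocal_rank` are hypothetical, and the participating systems used neural models rather than bag-of-words overlap), a minimal sketch of the retrieval-and-scoring setup might look like:

```python
import math
import re
from collections import Counter

def _tokens(text: str) -> Counter:
    """Lowercased bag-of-words token counts (punctuation stripped)."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine_sim(a: str, b: str) -> float:
    """Cosine similarity between two claims' token-count vectors."""
    va, vb = _tokens(a), _tokens(b)
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_verified_claims(input_claim: str, verified: list[str]) -> list[str]:
    """Rank previously fact-checked claims by similarity to the input claim."""
    return sorted(verified, key=lambda vc: cosine_sim(input_claim, vc), reverse=True)

def reciprocal_rank(ranked: list[str], gold: str) -> float:
    """1/rank of the gold verified claim; 0.0 if it is not retrieved."""
    for i, claim in enumerate(ranked, start=1):
        if claim == gold:
            return 1.0 / i
    return 0.0

# Toy example (claims invented for illustration).
verified = [
    "Drinking bleach cures COVID-19.",
    "The Eiffel Tower was sold for scrap in 1925.",
    "5G towers spread the coronavirus.",
]
ranking = rank_verified_claims("A post claims 5G networks spread coronavirus", verified)
score = reciprocal_rank(ranking, "5G towers spread the coronavirus.")  # → 1.0
```

In the actual lab, lexical overlap would be replaced by a learned matcher (e.g., fine-tuned BERT or RoBERTa encoders), but the ranking interface and the reciprocal-rank style of scoring stay the same.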
Check-Worthiness Estimation, Computational Journalism, COVID-19, Detecting Previously Fact-Checked Claims, Fact-Checking, Social Media Verification, Veracity, Verified Claims Retrieval
P. Nakov, G. Da San Martino, F. Alam, S. Shaar, H. Mubarak, and N. Babulkov, "Overview of the CLEF-2022 CheckThat! Lab Task 2 on Detecting Previously Fact-Checked Claims", in 2022 Conference and Labs of the Evaluation Forum, Bologna, September 2022, pp. 393-403, available online: http://ceur-ws.org/Vol-3180/paper-29.pdf