Document Type

Conference Proceeding

Publication Title

CEUR Workshop Proceedings

Abstract

We describe the fifth edition of the CheckThat! Lab, part of the 2022 Conference and Labs of the Evaluation Forum (CLEF). The lab evaluates technology supporting three tasks related to factuality, and it covers seven languages: Arabic, Bulgarian, Dutch, English, German, Spanish, and Turkish. Here, we present Task 2, which asks to detect previously fact-checked claims (in two languages). A total of six teams participated in this task and submitted a total of 37 runs; most submissions achieved sizable improvements over the baselines using transformer-based models such as BERT and RoBERTa. In this paper, we describe the process of data collection and the task setup, including the evaluation measures, and we give a brief overview of the participating systems. Last but not least, we release to the research community all datasets from the lab as well as the evaluation scripts, which should enable further research in detecting previously fact-checked claims. © 2022 Copyright for this paper by its authors.
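
As a minimal illustration of the verified-claims-retrieval setting summarized above, the sketch below ranks a collection of previously fact-checked claims against an input claim by cosine similarity of transformer sentence embeddings. It is not any participating system or the lab's evaluation script; the model name and the example claims are hypothetical, chosen only for demonstration.

# Illustrative sketch only: rank verified claims for an input claim by
# cosine similarity of transformer sentence embeddings.
# The model name and example claims below are assumptions for demonstration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

input_claim = "Drinking hot water cures the virus."
verified_claims = [
    "Hot drinks do not cure or prevent viral infections.",
    "The vaccine was approved after phase 3 trials.",
    "Masks reduce the spread of respiratory droplets.",
]

# Encode the input claim and the previously fact-checked claims.
query_emb = model.encode(input_claim, convert_to_tensor=True)
claim_embs = model.encode(verified_claims, convert_to_tensor=True)

# Cosine similarity induces a ranking; ranking-based measures such as MAP
# can then be computed against the gold verified claim.
scores = util.cos_sim(query_emb, claim_embs)[0]
ranking = sorted(zip(verified_claims, scores.tolist()), key=lambda x: -x[1])
for claim, score in ranking:
    print(f"{score:.3f}  {claim}")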

First Page

393

Last Page

403

Publication Date

9-2022

Keywords

Check-Worthiness Estimation, Computational Journalism, COVID-19, Detecting Previously Fact-Checked Claims, Fact-Checking, Social Media Verification, Veracity, Verified Claims Retrieval

Comments

Archived with thanks to CEUR Workshop Proceedings

License: CC BY 4.0

Uploaded 14 September 2022
