CrowdTruth 2.0: Quality Metrics for Crowdsourcing with Disagreement

Authors: Dumitrache, Anca; Inel, Oana; Aroyo, Lora; Timmermans, Benjamin; Welty, Chris
Publication Year: 2018
Subject:
Document Type: Working Paper
Description: Crowdsourcing-based approaches to gathering annotated data typically use inter-annotator agreement as a measure of quality. However, in many domains the data are ambiguous, and annotators bring a multitude of valid perspectives to the examples. In this paper, we present ongoing work on the CrowdTruth metrics, which capture and interpret inter-annotator disagreement in crowdsourcing. The CrowdTruth metrics model the inter-dependency between the three main components of a crowdsourcing system: worker, input data, and annotation. The goal of the metrics is to capture the degree of ambiguity in each of these three components. The metrics are available online at https://github.com/CrowdTruth/CrowdTruth-core.
Database: arXiv
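
To make the idea of disagreement-aware quality scoring concrete, the following is a minimal sketch, not the CrowdTruth-core API: the function names, the unweighted averaging, and the example data are all assumptions made for illustration. It scores one input unit by the mean pairwise cosine similarity of the workers' annotation vectors, so disagreement lowers the score rather than being discarded as noise.

import itertools
import numpy as np

def cosine(u, v):
    # Cosine similarity between two annotation vectors.
    norm = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / norm) if norm else 0.0

def unit_quality(worker_vectors):
    # Toy unit-quality score: mean pairwise cosine similarity of the
    # annotation vectors that different workers gave to one input unit.
    # Unanimous workers push the score toward 1; an ambiguous unit
    # draws divergent vectors and scores lower.
    pairs = list(itertools.combinations(worker_vectors, 2))
    return sum(cosine(u, v) for u, v in pairs) / len(pairs) if pairs else 1.0

# Three workers annotate one unit over a closed set of four labels;
# a 1 marks each label the worker selected.
votes = [
    np.array([1, 0, 0, 1]),  # worker A: labels 0 and 3
    np.array([1, 0, 0, 0]),  # worker B: label 0
    np.array([0, 1, 0, 1]),  # worker C: labels 1 and 3
]
print(round(unit_quality(votes), 3))  # ~0.402: substantial disagreement

The actual CrowdTruth metrics go further, jointly and iteratively scoring workers, input units, and annotations so that each component's ambiguity informs the others; see the repository linked above for the full definitions.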