Different Tastes of Entities: Investigating Human Label Variation in Named Entity Annotations
Author: | Peng, Siyao; Sun, Zihang; Loftus, Sebastian; Plank, Barbara |
---|---|
Year of publication: | 2024 |
Document type: | Working Paper |
Description: | Named Entity Recognition (NER) is a key information extraction task with a long-standing tradition. While recent studies address and aim to correct annotation errors via re-labeling efforts, little is known about the sources of human label variation, such as text ambiguity, annotation error, or guideline divergence. This is especially the case for high-quality datasets and beyond English CoNLL03. This paper studies disagreements in expert-annotated named entity datasets for three languages: English, Danish, and Bavarian. We show that text ambiguity and artificial guideline changes are dominant factors for diverse annotations among high-quality revisions. We survey student annotations on a subset of difficult entities and substantiate the feasibility and necessity of manifold annotations for understanding named entity ambiguities from a distributional perspective. |
Comment: | 9 pages; accepted at the UnImplicit workshop at EACL 2024 |
Database: | arXiv |