Addressing Topic Leakage in Cross-Topic Evaluation for Authorship Verification

Author: Sawatphol, Jitkapat; Udomcharoenchaikit, Can; Nutanong, Sarana
Year of publication: 2024
Subject:
Document type: Working Paper
Description: Authorship verification (AV) aims to identify whether a pair of texts was written by the same author. We address the challenge of evaluating AV models' robustness against topic shifts. Conventional evaluation assumes minimal topic overlap between training and test data. However, we argue that topic leakage can still occur in test data, yielding misleading performance estimates and unstable model rankings. To address this, we propose an evaluation method called Heterogeneity-Informed Topic Sampling (HITS), which creates a smaller dataset with a heterogeneously distributed topic set. Our experimental results demonstrate that HITS-sampled datasets yield more stable model rankings across random seeds and evaluation splits. Our contributions are: 1. an analysis of the causes and effects of topic leakage; 2. a demonstration of HITS's effectiveness in reducing the effects of topic leakage; and 3. the Robust Authorship Verification bENchmark (RAVEN), which enables topic shortcut tests to uncover AV models' reliance on topic-specific features.
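To make the sampling idea concrete, the sketch below shows one way an evaluation subset could be drawn so that its topic distribution is heterogeneous rather than dominated by a few frequent topics. This is not the paper's HITS algorithm; it is only a toy illustration, and the helper `heterogeneous_topic_sample`, its parameters, and the `topic_of` mapping are all hypothetical.

```python
# Toy sketch of topic-heterogeneous evaluation sampling.
# NOTE: this is NOT the HITS algorithm from the paper; it only illustrates
# the idea of capping how many verification pairs any topic combination
# contributes to a smaller, more topically heterogeneous test set.
import random
from collections import defaultdict

def heterogeneous_topic_sample(pairs, topic_of, per_topic=10, seed=0):
    """Down-sample verification pairs so no topic combination dominates.

    pairs     : list of (text_id_a, text_id_b, same_author_label)
    topic_of  : dict mapping text_id -> topic label (assumed to be given)
    per_topic : cap on the number of pairs kept per topic combination
    """
    rng = random.Random(seed)
    by_topic = defaultdict(list)
    for a, b, label in pairs:
        # Group pairs by the (unordered) topic combination of the two texts.
        key = tuple(sorted((topic_of[a], topic_of[b])))
        by_topic[key].append((a, b, label))

    sampled = []
    for bucket in by_topic.values():
        rng.shuffle(bucket)
        sampled.extend(bucket[:per_topic])  # keep at most `per_topic` pairs
    rng.shuffle(sampled)
    return sampled
```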
Comment: Accepted for publication in Transactions of the Association for Computational Linguistics
Database: arXiv