To Tune or Not To Tune? Zero-shot Models for Legal Case Entailment

Authors: Ruan Chaves Rodrigues, Guilherme Moraes Rosa, Roberto de Alencar Lotufo, Rodrigo Nogueira
Year: 2022
Source: ICAIL
DOI: 10.48550/arxiv.2202.03120
Abstract: There has been mounting evidence that pretrained language models fine-tuned on large and diverse supervised datasets can transfer well to a variety of out-of-domain tasks. In this work, we investigate this transfer ability in the legal domain. To that end, we participated in the legal case entailment task of COLIEE 2021, in which we use such models with no adaptation to the target domain. Our submissions achieved the highest scores, surpassing the second-best submission by more than six percentage points. Our experiments confirm a counter-intuitive result in the new paradigm of pretrained language models: given limited labeled data, models with little or no adaptation to the target task can be more robust to changes in the data distribution than models fine-tuned on it. Code is available at https://github.com/neuralmind-ai/coliee.
Database: OpenAIRE
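The zero-shot setup the abstract describes (ranking candidate paragraphs of a noticed case by a relevance score from a pretrained model, with no fine-tuning on legal data) can be sketched as below. This is an illustrative sketch, not the authors' implementation (which is at the linked repository): the word-overlap scorer, function names, and sample texts are all hypothetical stand-ins, used here so the pipeline shape is runnable without downloading model weights.

```python
# Hypothetical sketch of the legal case entailment re-ranking pipeline.
# In the paper, the score comes from a pretrained relevance model applied
# zero-shot; Jaccard word overlap is used here only as a runnable stand-in.

def overlap_score(query: str, candidate: str) -> float:
    """Stand-in relevance score: Jaccard overlap of lowercase word sets."""
    q, c = set(query.lower().split()), set(candidate.lower().split())
    return len(q & c) / len(q | c) if q | c else 0.0

def rank_candidates(query: str, candidates: dict) -> list:
    """Score each candidate paragraph against the query fragment and
    return (paragraph_id, score) pairs, best first."""
    scored = [(pid, overlap_score(query, text)) for pid, text in candidates.items()]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical query fragment and candidate paragraphs.
    query = "the appellant seeks leave to appeal the sentencing decision"
    candidates = {
        "P1": "leave to appeal a sentencing decision requires arguable grounds",
        "P2": "the contract was terminated for convenience by the purchaser",
    }
    print(rank_candidates(query, candidates))
```

In the actual task, the top-scoring paragraphs (subject to a score threshold) would be submitted as the entailing paragraphs for each query fragment.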