To Tune or Not To Tune? Zero-shot Models for Legal Case Entailment
Author: | Ruan Chaves Rodrigues, Guilherme Moraes Rosa, Roberto de Alencar Lotufo, Rodrigo Nogueira |
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences
Computer Science - Machine Learning (cs.LG), Computer Science - Computation and Language (cs.CL), Computer science, business.industry, computer.software_genre, Variety (linguistics), Logical consequence, Task (project management), Domain (software engineering), Zero (linguistics), Code (cryptography), Language model, Artificial intelligence, Legal case, business, computer, Natural language processing |
Source: | ICAIL |
DOI: | 10.48550/arxiv.2202.03120 |
Description: | There has been mounting evidence that pretrained language models fine-tuned on large and diverse supervised datasets can transfer well to a variety of out-of-domain tasks. In this work, we investigate this transfer ability to the legal domain. For that, we participated in the legal case entailment task of COLIEE 2021, in which we use such models with no adaptations to the target domain. Our submissions achieved the highest scores, surpassing the second-best submission by more than six percentage points. Our experiments confirm a counter-intuitive result in the new paradigm of pretrained language models: given limited labeled data, models with little or no adaptation to the target task can be more robust to changes in the data distribution than models fine-tuned on it. Code is available at https://github.com/neuralmind-ai/coliee. |
Database: | OpenAIRE |
External link: |