Enhancing Lifelong Language Learning by Improving Pseudo-Sample Generation

Authors: Kasidis Kanwatchara, Thanapapas Horsuwan, Piyawat Lertvittayakumjorn, Boonserm Kijsirikul, Peerapon Vateekul
Language: English
Year of publication: 2022
Source: Computational Linguistics, Vol 48, Iss 4 (2022)
Document type: article
ISSN: 1530-9312
DOI: 10.1162/coli_a_00449
Abstract: To achieve lifelong language learning, pseudo-rehearsal methods leverage samples generated from a language model to refresh the knowledge of previously learned tasks. Without proper controls, however, these methods can fail to retain the knowledge of complex tasks with longer texts, since most of the generated samples are of low quality. To overcome this problem, we propose three specific contributions. First, we utilize two language models, each specializing in a specific part of the input, to produce high-quality pseudo-samples. Second, we reduce the number of parameters used by applying adapter modules, enhancing training efficiency. Third, we further improve the overall quality of pseudo-samples using temporal ensembling and sample regeneration. The results show that our framework achieves significant improvements over baselines on multiple task sequences. Also, our pseudo-sample analysis reveals helpful insights for designing even better pseudo-rehearsal methods in the future.
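As a rough illustration of the pseudo-rehearsal setting the abstract describes (not the authors' implementation, which is detailed in the full text), the sketch below shows a generic LAMOL-style loop: before training on a new task, the language model generates pseudo-samples to rehearse earlier tasks, and these are mixed into the new task's training data. The model name, the `GEN_TOKEN` prompt, the mixing ratio `GAMMA`, and all function names are illustrative assumptions.

```python
# Generic pseudo-rehearsal sketch (illustration only, not the paper's code).
# Assumes Hugging Face transformers; model choice and hyperparameters are arbitrary.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

GEN_TOKEN = "__gen__"   # hypothetical prompt token that triggers pseudo-sample generation
GAMMA = 0.2             # hypothetical fraction of pseudo-samples mixed into each new task

def generate_pseudo_samples(n_samples, max_length=128):
    """Sample text from the current LM to rehearse previously learned tasks."""
    inputs = tokenizer(GEN_TOKEN, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,
        top_k=50,
        max_length=max_length,
        num_return_sequences=n_samples,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

def train_task(real_texts):
    """Train on the new task's data plus pseudo-samples of earlier tasks."""
    pseudo = generate_pseudo_samples(int(GAMMA * len(real_texts)))
    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
    for text in real_texts + pseudo:
        ids = tokenizer(text, return_tensors="pt").input_ids
        loss = model(ids, labels=ids).loss  # causal LM objective on each sample
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

The paper's contributions refine this basic loop: generation is split across two specialized language models, adapter modules reduce the number of trained parameters, and temporal ensembling plus sample regeneration raise pseudo-sample quality; those mechanisms are described in the full text.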
Database: Directory of Open Access Journals