Learning to Learn to Disambiguate: Meta-Learning for Few-Shot Word Sense Disambiguation
Author: Holla, N., Mishra, P., Yannakoudakis, H., Shutova, E., Cohn, T., He, Y., Liu, Y.
Contributors: ILLC (FNWI), Language and Computation (ILLC, FNWI/FGw)
Language: English
Year of publication: 2020
Subject: FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computation and Language (cs.CL); meta-learning; machine learning; deep learning; artificial intelligence; computer science; human intelligence
Source: Findings of the Association for Computational Linguistics: EMNLP 2020, 16-20 November 2020, pp. 4517-4533
Description: The success of deep learning methods hinges on the availability of large training datasets annotated for the task of interest. In contrast to human intelligence, these methods lack versatility and struggle to learn and adapt quickly to new tasks where labeled data is scarce. Meta-learning aims to solve this problem by training a model on a large number of few-shot tasks, with the objective of learning new tasks quickly from a small number of examples. In this paper, we propose a meta-learning framework for few-shot word sense disambiguation (WSD), where the goal is to learn to disambiguate unseen words from only a few labeled instances. Meta-learning approaches have so far typically been tested in an $N$-way, $K$-shot classification setting, where each task has $N$ classes with $K$ examples per class. By its nature, WSD deviates from this controlled setup and requires models to handle a large number of highly unbalanced classes. We extend several popular meta-learning approaches to this scenario and analyze their strengths and weaknesses in this challenging new setting. Camera-ready: Findings of EMNLP.
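For context, the conventional $N$-way, $K$-shot episodic setup that the abstract contrasts WSD against can be sketched as below. This is a minimal illustration of standard episode sampling, not code from the paper; the function name, signature, and data layout are assumptions, and the paper's WSD tasks differ precisely in that the number of senses per word is variable and the class distribution is unbalanced rather than fixed at $N$ balanced classes.

```python
import random


def sample_episode(labeled_pool, n_way, k_shot, q_queries, rng=None):
    """Sample one N-way, K-shot episode from a labeled pool.

    labeled_pool: dict mapping a class label (e.g. a word sense)
                  to a list of labeled examples.
    Returns (support, query): lists of (example, label) pairs with
    n_way * k_shot support items and n_way * q_queries query items.
    """
    rng = rng or random.Random()
    # Pick N classes for this episode (sorted() makes the draw deterministic
    # for a seeded rng, since dict order is not part of the contract here).
    classes = rng.sample(sorted(labeled_pool), n_way)
    support, query = [], []
    for label in classes:
        # Draw K support + Q query examples per class, without overlap.
        examples = rng.sample(labeled_pool[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query
```

A meta-learner would be trained over many such episodes, fitting on the support set and being evaluated on the query set of each; the paper's point is that real WSD data does not fit this balanced template.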
Database: OpenAIRE
External link: