Can You Tell Me How to Get Past Sesame Street? Sentence-Level Pretraining Beyond Language Modeling
Author: Benjamin Van Durme, Ellie Pavlick, R. Thomas McCoy, Raghavendra Pappagari, Patrick Xia, Najoung Kim, Yinghui Huang, Katherin Yu, Roma Patel, Jan Hula, Edouard Grave, Shuning Jin, Ian Tenney, Samuel R. Bowman, Berlin Chen, Alex Wang
Subject: FOS: Computer and information sciences; Computation and Language (cs.CL); Computer science; Natural language understanding; Language model; Transfer of learning; Sentence; Cognitive psychology
Source: Scopus-Elsevier; ACL (1)
Description: Natural language understanding has recently seen a surge of progress with the use of sentence encoders like ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2019), which are pretrained on variants of language modeling. We conduct the first large-scale systematic study of candidate pretraining tasks, comparing 19 different tasks both as alternatives and as complements to language modeling. Our primary results support the use of language modeling, especially when combined with pretraining on additional labeled-data tasks. However, our results are mixed across pretraining tasks and show some concerning trends: in ELMo's pretrain-then-freeze paradigm, random baselines are worryingly strong and results vary strikingly across target tasks. In addition, fine-tuning BERT on an intermediate task often negatively impacts downstream transfer. In a more positive trend, we see modest gains from multitask training, suggesting that the development of more sophisticated multitask and transfer learning techniques is a promising avenue for further research. Comment: ACL 2019. This paper supersedes "Looking for ELMo's Friends: Sentence-Level Pretraining Beyond Language Modeling", an earlier version of this work by the same authors.
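The description contrasts two transfer regimes evaluated in the paper: ELMo's pretrain-then-freeze setup, where the encoder is fixed and only a small task head is trained, and BERT-style intermediate-task fine-tuning, where the whole encoder is updated on a labeled task before the target task. The following is a minimal sketch of that distinction, not the authors' code; the checkpoint name, label count, and helper function are illustrative assumptions built on the Hugging Face `transformers` API.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

# Assumed checkpoint and label count; the paper's experiments cover ELMo
# and BERT encoders across many intermediate and target tasks.
encoder = AutoModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
task_head = nn.Linear(encoder.config.hidden_size, 2)  # 2 labels: an assumption

# --- Regime 1: pretrain-then-freeze (ELMo-style) ---
# Every encoder weight is frozen; only `task_head` receives gradients, so
# target-task performance depends entirely on the frozen representations.
for p in encoder.parameters():
    p.requires_grad = False

# --- Regime 2: intermediate-task fine-tuning (BERT-style) ---
# Re-enable gradients and update the full encoder on an intermediate labeled
# task before fine-tuning again on the target task; the paper reports that
# this often hurts downstream transfer.
for p in encoder.parameters():
    p.requires_grad = True

def training_step(texts, labels, optimizer):
    """One supervised update; `texts` is a list[str], `labels` a LongTensor."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    cls = encoder(**batch).last_hidden_state[:, 0]  # [CLS] sentence vector
    loss = nn.functional.cross_entropy(task_head(cls), labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# Optimize only the parameters left trainable by the chosen regime.
optimizer = torch.optim.AdamW(
    [p for p in list(encoder.parameters()) + list(task_head.parameters())
     if p.requires_grad],
    lr=2e-5)
```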
Database: OpenAIRE
External link: