Author:
Sahu, Pritish, Cogswell, Michael, Rutherford-Quach, Sara, Divakaran, Ajay
Year of publication:
2021
Subject:
Document type:
Working Paper
Description:
Current pre-trained language models have a great deal of knowledge, but a more limited ability to use that knowledge. Bloom's Taxonomy helps educators teach children how to use knowledge by categorizing comprehension skills, so we use it to analyze and improve the comprehension skills of large pre-trained language models. Our experiments focus on zero-shot question answering, using the taxonomy to provide proximal context that helps the model answer questions by being relevant to those questions. We show that targeting context in this manner improves performance across four popular common-sense question-answering datasets.
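The general idea described above, prepending a relevant "proximal context" sentence before scoring answers with a pre-trained language model in a zero-shot setting, could be sketched minimally as follows. The GPT-2 log-likelihood scorer, the example question, and the hand-written context sentence are illustrative assumptions only, not the paper's actual implementation or datasets.

```python
# Minimal sketch: zero-shot multiple-choice QA where an extra context
# sentence is prepended before scoring each answer candidate with a
# pre-trained language model. Model choice and scoring are assumptions.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def answer_score(context: str, question: str, answer: str) -> float:
    """Negative LM loss (higher is better) of context + question + answer."""
    text = f"{context} {question} {answer}".strip()
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, labels=inputs["input_ids"])
    return -out.loss.item()

# Hypothetical example question, answer choices, and context sentence.
question = "Where would you put a plate after washing it?"
choices = ["in the cupboard", "in the garden", "on the moon"]
context = "Clean dishes are usually stored in a kitchen cupboard."

best = max(choices, key=lambda c: answer_score(context, question, c))
print("Predicted answer:", best)
```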
Database:
arXiv
External link: