What Makes Pre-trained Language Models Better Zero-shot Learners?

Author: Lu, Jinghui; Zhu, Dongsheng; Han, Weidong; Zhao, Rui; Mac Namee, Brian; Tan, Fei
Publication year: 2022
Subject:
Document type: Working Paper
Description: Current methods for prompt learning in zero-shot scenarios widely rely on a development set with sufficient human-annotated data to select the best-performing prompt template a posteriori. This is not ideal because in a real-world zero-shot scenario of practical relevance, no labelled data is available. Thus, we propose a simple yet effective method for screening reasonable prompt templates in zero-shot text classification: Perplexity Selection (Perplection). We hypothesize that language discrepancy can be used to measure the efficacy of prompt templates, and thereby develop a substantiated perplexity-based scheme allowing for forecasting the performance of prompt templates in advance. Experiments show that our method leads to improved prediction performance in a realistic zero-shot setting, eliminating the need for any labelled examples.
Comment: Accepted to ACL2023 main conference
Database: arXiv
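The core idea described in the abstract, ranking candidate prompt templates by how "natural" a language model finds them and selecting the lowest-perplexity one, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the perplexity formula is the standard one, but the template strings, log-probability values, and helper names are hypothetical; in practice the per-token log-probabilities would come from a pre-trained language model (e.g. GPT-2) scoring each filled-in template.

```python
import math

def perplexity(token_logprobs):
    # Standard definition: perplexity = exp(-mean log-probability)
    # over the tokens of a filled-in prompt.
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

def select_template(scored_templates):
    # Pick the template with the lowest perplexity, i.e. the one the
    # language model judges most natural, without any labelled data.
    return min(scored_templates, key=lambda t: t[1])[0]

# Hypothetical candidate templates with made-up token log-probabilities;
# a real pipeline would obtain these from a pre-trained LM.
candidates = [
    ("This news is about {label}.", perplexity([-2.1, -0.8, -1.3])),
    ("Topic {label} text follows:", perplexity([-4.0, -3.2, -2.9])),
]
best = select_template(candidates)  # template with the lower perplexity
```

The selection is fully a priori: no development set or labelled examples are consulted, which matches the realistic zero-shot setting the abstract argues for.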