Fine-tuning GPT-2 on annotated RPG quests for NPC dialogue generation

Author: van Stegeren, Judith; Myśliwiec, Jakub; Fowler, Allan; Pirker, Johanna; Canossa, Alessandro; Arya, Ali; Harteveld, Casper
Contributors: Digital Society Institute, Human Media Interaction
Language: English
Year of publication: 2021
Subject:
Source: FDG '21: The 16th International Conference on the Foundations of Digital Games (FDG) 2021
Description: GPT-2, a neural language model trained on a large dataset of English web text, has been used for a variety of natural language generation tasks because of the language quality and coherence of its output. To investigate the usability of GPT-2 for text generation in video games, we fine-tuned GPT-2 on a corpus of video game quests and used the resulting model to generate dialogue lines for quest-giver NPCs in a role-playing game. We show that the model learned the structure of quests and NPC dialogue, and we investigate how the temperature parameter influences the language quality and creativity of the output artifacts. We evaluated our approach with a crowdsourcing experiment in which human judges rated hand-written and generated quest texts on language quality, coherence, and creativity.
Database: OpenAIRE
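
The abstract describes fine-tuning GPT-2 on a quest corpus and sampling NPC dialogue at different temperature settings. The sketch below is a minimal, hypothetical illustration of such a pipeline using the Hugging Face transformers and datasets libraries; it is not the authors' code, and the corpus file name, prompt, and hyperparameters are assumptions.

# Minimal sketch (not the authors' code): fine-tune GPT-2 on a plain-text quest
# corpus, then sample a quest-giver line at a chosen temperature.
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical corpus file: one quest text (giver dialogue, objective, reward) per line.
raw = load_dataset("text", data_files={"train": "quests_train.txt"})["train"]
dataset = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-quests",
        num_train_epochs=3,
        per_device_train_batch_size=4,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()

# Temperature controls the quality/creativity trade-off studied in the paper:
# lower values give safer, more conservative text; higher values give more varied text.
prompt = tokenizer("Greetings, traveler. I need your help:", return_tensors="pt")
output = model.generate(
    **prompt.to(model.device),
    do_sample=True,
    temperature=0.8,
    max_length=120,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))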