The Go Transformer: Natural Language Modeling for Game Play

Author: David Noever, Josh Kalin, Matthew Ciolino
Year of publication: 2020
Source: AI4I
DOI: 10.1109/ai4i49448.2020.00012
Description: This work applies natural language modeling to generate plausible strategic moves in the ancient game of Go. We train the Generative Pretrained Transformer (GPT-2) to mimic the style of Go champions as archived in Smart Game Format (SGF), which offers a text description of move sequences. The trained model further generates valid but previously unseen strategies for Go. Because GPT-2 preserves punctuation and spacing, the raw output of the text generator provides inputs to game visualization and creative patterns, such as auto-replay in the Sabaki project's game engine. Results demonstrate that language modeling can capture both the sequencing format of championship Go games and their strategic formations. Compared to random game boards, the GPT-2 fine-tuning shows efficient opening move sequences favoring corner play over less advantageous center and side play. Game generation as a language modeling task offers novel approaches to more than 40 other board games where historical text annotation provides training data (e.g., Amazons & Connect 4/6).
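
The description outlines fine-tuning GPT-2 on raw SGF text and sampling new move sequences from it. The following is a minimal sketch of that setup using the Hugging Face transformers library and a hypothetical games.sgf training file; the paper does not specify its tooling, so the library, hyperparameters, and file layout here are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): fine-tune GPT-2 on SGF move text
# and sample a continuation. Assumes Hugging Face `transformers` and a local
# `games.sgf` file of blank-line-separated SGF records (hypothetical path).
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Treat each SGF record as one training string,
# e.g. "(;GM[1]SZ[19];B[pd];W[dp];B[qp];W[dd] ...)"
with open("games.sgf") as f:
    games = [g for g in f.read().split("\n\n") if g.strip()]

model.train()
for game in games:
    enc = tokenizer(game, return_tensors="pt", truncation=True, max_length=512)
    loss = model(**enc, labels=enc["input_ids"]).loss  # causal LM objective
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Generate a plausible continuation from a short SGF opening prefix.
model.eval()
prefix = "(;GM[1]SZ[19];B[pd];W[dp]"
ids = tokenizer(prefix, return_tensors="pt").input_ids
out = model.generate(ids, max_length=120, do_sample=True, top_k=40,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0]))
```

Because the generated text keeps SGF punctuation and spacing intact, the sampled string can be saved to a .sgf file and replayed directly in a viewer such as Sabaki, as the description notes.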
8 Pages, 5 Figures, 1 Table, IEEE Format, AI4I 2020
Database: OpenAIRE