The Go Transformer: Natural Language Modeling for Game Play
Author: | David Noever, Josh Kalin, Matthew Ciolino |
---|---|
Year of publication: | 2020 |
Subject: |
FOS: Computer and information sciences
Computer Science - Machine Learning (cs.LG); Computer Science - Computation and Language (cs.CL); deep learning; language model; transformer (machine learning model); natural language; artificial intelligence; visualization; human–computer interaction |
Source: | AI4I |
DOI: | 10.1109/ai4i49448.2020.00012 |
Description: | This work applies natural language modeling to generate plausible strategic moves in the ancient game of Go. We train the Generative Pretrained Transformer (GPT-2) to mimic the style of Go champions as archived in Smart Game Format (SGF), which offers a text description of move sequences. The trained model further generates valid but previously unseen strategies for Go. Because GPT-2 preserves punctuation and spacing, the raw output of the text generator provides inputs to game visualization and creative patterns, such as the Sabaki project's game engine using auto-replays. Results demonstrate that language modeling can capture both the sequencing format of championship Go games and their strategic formations. Compared to random game boards, the GPT-2 fine-tuning shows efficient opening move sequences favoring corner play over less advantageous center and side play. Game generation as a language modeling task offers novel approaches to more than 40 other board games where historical text annotation provides training data (e.g., Amazons & Connect 4/6). 8 Pages, 5 Figures, 1 Table, IEEE Format, AI4I 2020 |
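The description hinges on two technical points: SGF records a game as plain text (each move is `;B[xy]` or `;W[xy]`, with two letters giving board coordinates), and the paper's evaluation classifies generated opening moves as corner, side, or center play. The sketch below illustrates both, as an assumption about how such records could be parsed; the regex pattern, function names, and the 4-line corner margin are illustrative choices, not taken from the paper.

```python
import re

def sgf_moves(sgf_text):
    """Extract (color, column, row) moves from an SGF game record.

    SGF writes each move as ;B[xy] or ;W[xy], where x and y are
    letters 'a'-'s' encoding 0-18 coordinates on a 19x19 board.
    Passes (empty brackets) are skipped by the two-letter pattern.
    """
    moves = []
    for color, coords in re.findall(r";([BW])\[([a-s]{2})\]", sgf_text):
        col = ord(coords[0]) - ord("a")
        row = ord(coords[1]) - ord("a")
        moves.append((color, col, row))
    return moves

def region(col, row, size=19, margin=4):
    """Classify a board point as 'corner', 'side', or 'center'.

    A point within `margin` lines of two perpendicular edges counts
    as corner play; within margin of one edge, side play; else center.
    The margin of 4 is an illustrative assumption.
    """
    near_cols = col < margin or col >= size - margin
    near_rows = row < margin or row >= size - margin
    if near_cols and near_rows:
        return "corner"
    if near_cols or near_rows:
        return "side"
    return "center"
```

For example, the opening `(;GM[1]FF[4]SZ[19];B[pd];W[dp])` parses to two corner moves, matching the corner-favoring openings the paper reports for the fine-tuned model.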
Database: | OpenAIRE |
External link: |