Next Patch Prediction for Autoregressive Visual Generation

Author: Pang, Yatian; Jin, Peng; Yang, Shuo; Lin, Bin; Zhu, Bin; Tang, Zhenyu; Chen, Liuhan; Tay, Francis E. H.; Lim, Ser-Nam; Yang, Harry; Yuan, Li
Publication Year: 2024
Subject:
Document Type: Working Paper
Description: Autoregressive models built on the Next Token Prediction (NTP) paradigm show great potential for developing a unified framework that integrates both language and vision tasks. In this work, we rethink NTP for autoregressive image generation and propose a novel Next Patch Prediction (NPP) paradigm. Our key idea is to group and aggregate image tokens into patch tokens with higher information density. With patch tokens as a shorter input sequence, the autoregressive model is trained to predict the next patch, significantly reducing the computational cost. We further propose a multi-scale coarse-to-fine patch grouping strategy that exploits the natural hierarchical structure of image data. Experiments on a diverse range of models (100M-1.4B parameters) demonstrate that the next patch prediction paradigm reduces the training cost to roughly 0.6x that of the baseline while improving image generation quality by up to 1.0 FID on the ImageNet benchmark. We highlight that our method retains the original autoregressive model architecture without introducing additional trainable parameters or requiring a custom image tokenizer, ensuring flexibility and seamless adaptation to various autoregressive models for visual generation.
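A minimal PyTorch sketch of the token-grouping idea described above. The function name, the averaging aggregation, and the grid/window parameters are illustrative assumptions, not taken from the authors' released code (linked below); the multi-scale coarse-to-fine strategy would correspond to decreasing the window size k over the course of training.

```python
import torch

def group_tokens(token_embeds: torch.Tensor, grid: int, k: int) -> torch.Tensor:
    """Aggregate a (B, grid*grid, D) sequence of image-token embeddings into
    patch tokens by averaging non-overlapping k x k windows, yielding a
    (B, (grid//k)**2, D) sequence that is k**2 times shorter."""
    B, N, D = token_embeds.shape
    assert N == grid * grid and grid % k == 0
    x = token_embeds.view(B, grid, grid, D)
    # Split the token grid into k x k windows and average within each window.
    x = x.view(B, grid // k, k, grid // k, k, D)
    patch_tokens = x.mean(dim=(2, 4))    # (B, grid//k, grid//k, D)
    return patch_tokens.flatten(1, 2)    # (B, (grid//k)**2, D)

# Example: a 24x24 token grid (576 tokens) grouped with k=2 gives 144 patch
# tokens; the autoregressive model is then trained to predict the next patch
# token, shrinking the training sequence length by 4x.
tokens = torch.randn(2, 24 * 24, 768)
patches = group_tokens(tokens, grid=24, k=2)
print(patches.shape)  # torch.Size([2, 144, 768])
```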
Comment: Code: https://github.com/PKU-YuanGroup/Next-Patch-Prediction
Database: arXiv