Text Diffusion with Reinforced Conditioning

Author: Liu, Yuxuan; Yang, Tianchi; Huang, Shaohan; Zhang, Zihan; Huang, Haizhen; Wei, Furu; Deng, Weiwei; Sun, Feng; Zhang, Qi
Publication year: 2024
Subject:
Document type: Working Paper
Description: Diffusion models have demonstrated exceptional capability in generating high-quality images, videos, and audio. Thanks to their adaptability to iterative refinement, they hold strong potential for better non-autoregressive sequence generation. However, existing text diffusion models still fall short in performance because of the difficulty of handling the discreteness of language. This paper thoroughly analyzes text diffusion models and uncovers two significant limitations: degradation of self-conditioning during training and misalignment between training and sampling. Motivated by our findings, we propose TREC, a novel text diffusion model that mitigates the degradation with Reinforced Conditioning and the misalignment with Time-Aware Variance Scaling. Extensive experiments demonstrate the competitiveness of TREC against autoregressive, non-autoregressive, and diffusion baselines. Moreover, qualitative analysis shows its advanced ability to fully utilize the diffusion process in refining samples.
Comment: 9 pages, 3 figures
Database: arXiv
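
For readers unfamiliar with the self-conditioning mechanism whose training degradation the abstract refers to, the following is a minimal PyTorch sketch of a generic self-conditioned training step for continuous text diffusion, in the style popularized by Analog Bits. The denoiser signature model(z_t, t, x0_prev), the 50% self-conditioning probability, and all tensor shapes are illustrative assumptions, not details taken from the paper; TREC's Reinforced Conditioning modifies this scheme in ways not shown here.

import torch
import torch.nn.functional as F

def self_conditioned_training_step(model, x0, alphas_cumprod):
    # x0: clean token embeddings, shape (batch, seq_len, dim)  -- assumed layout
    # alphas_cumprod: 1-D tensor of cumulative noise-schedule products, length T
    b = x0.size(0)
    t = torch.randint(0, alphas_cumprod.size(0), (b,), device=x0.device)
    a = alphas_cumprod[t].view(b, 1, 1)

    # Forward diffusion: sample z_t ~ q(z_t | x0)
    noise = torch.randn_like(x0)
    z_t = a.sqrt() * x0 + (1.0 - a).sqrt() * noise

    # Self-conditioning: with some probability (50% here, an assumption),
    # first predict x0 without gradients and feed that estimate back to the
    # model as an extra conditioning input; otherwise condition on zeros.
    x0_prev = torch.zeros_like(x0)
    if torch.rand(()) < 0.5:
        with torch.no_grad():
            x0_prev = model(z_t, t, torch.zeros_like(x0))

    x0_hat = model(z_t, t, x0_prev)
    return F.mse_loss(x0_hat, x0)

The sketch only illustrates the plain coin-flip self-conditioning scheme whose quality the paper argues degrades during training; the proposed Reinforced Conditioning and Time-Aware Variance Scaling are not implemented here.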