Generative Models as Distributions of Functions
| Author | Dupont, E.; Teh, Y. W.; Doucet, A. |
| --- | --- |
| Year of publication | 2021 |
| Subject | |
| DOI | 10.48550/arxiv.2102.04776 |
| Description | Generative models are typically trained on grid-like data such as images. As a result, the size of these models usually scales directly with the underlying grid resolution. In this paper, we abandon discretized grids and instead parameterize individual data points by continuous functions. We then build generative models by learning distributions over such functions. By treating data points as functions, we can abstract away from the specific type of data we train on and construct models that are agnostic to discretization. To train our model, we use an adversarial approach with a discriminator that acts on continuous signals. Through experiments on a wide variety of data modalities, including images, 3D shapes and climate data, we demonstrate that our model can learn rich distributions of functions independently of data type and resolution. Comment: AISTATS 2022 oral, camera-ready version; incorporated reviewer feedback. (An illustrative sketch of the approach follows this record.) |
| Database | OpenAIRE |
| External link | |
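
The description outlines the core idea: each data point is represented as a function from coordinates to features, and the discriminator scores sets of (coordinate, feature) pairs rather than a fixed grid. The following is a minimal illustrative sketch of that idea in PyTorch, not the authors' implementation; the module names, layer sizes, and the simple mean-pooled set discriminator are assumptions chosen for brevity, and the paper's actual architecture differs (see the DOI above for details).

```python
# Hypothetical sketch (not the authors' code): a generative model over functions.
# A latent code conditions a coordinate MLP that plays the role of a continuous
# "function sample"; the discriminator operates on sets of (coordinate, feature)
# pairs, so neither network ever sees a fixed grid.

import torch
import torch.nn as nn


class FunctionGenerator(nn.Module):
    """Maps a latent code z and query coordinates x to feature values f(x)."""

    def __init__(self, latent_dim=64, coord_dim=2, feat_dim=3, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + coord_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim),
        )

    def forward(self, z, coords):
        # z: (batch, latent_dim), coords: (batch, num_points, coord_dim)
        z_expanded = z.unsqueeze(1).expand(-1, coords.shape[1], -1)
        return self.net(torch.cat([z_expanded, coords], dim=-1))


class PointSetDiscriminator(nn.Module):
    """Scores a set of (coordinate, feature) pairs; a simplified, permutation-
    invariant stand-in for a discriminator acting on continuous signals."""

    def __init__(self, coord_dim=2, feat_dim=3, hidden=128):
        super().__init__()
        self.point_net = nn.Sequential(
            nn.Linear(coord_dim + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)

    def forward(self, coords, feats):
        # Pool per-point embeddings so the score is independent of point count.
        h = self.point_net(torch.cat([coords, feats], dim=-1)).mean(dim=1)
        return self.head(h)


if __name__ == "__main__":
    gen, disc = FunctionGenerator(), PointSetDiscriminator()
    z = torch.randn(4, 64)          # latent codes, one per generated function
    coords = torch.rand(4, 256, 2)  # 256 random 2D query coordinates per sample
    feats = gen(z, coords)          # evaluate each generated function at the coordinates
    score = disc(coords, feats)     # (4, 1) realness scores for the point sets
    print(feats.shape, score.shape)
```

Because both networks take arbitrary coordinate sets as input, the same model can, in principle, be queried at any resolution, which is the discretization-agnostic property the description emphasizes.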