Pulling back information geometry
| Author: | Georgios Arvanitidis, Miguel González-Duque, Alison Pouplin, Dimitrios Kalatzis, Søren Hauberg |
| --- | --- |
| Language: | English |
| Year of publication: | 2022 |
| Subject: | |
| Source: | Arvanitidis, G., González-Duque, M., Pouplin, A., Kalatzis, D. & Hauberg, S. 2022, 'Pulling back information geometry', in Proceedings of the 25th International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research, vol. 151, 25th International Conference on Artificial Intelligence and Statistics, 28/03/2022. Technical University of Denmark Orbit |
| Description: | Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models. The existing theory, however, relies on the decoder being a Gaussian distribution, as its simple reparametrization allows us to interpret the generating process as a random projection of a deterministic manifold. Consequently, this approach breaks down when applied to decoders that are not as easily reparametrized. Here we propose to use the Fisher-Rao metric associated with the space of decoder distributions as a reference metric, which we pull back to the latent space (a sketch of this pull-back construction follows the record). We show that we can achieve meaningful latent geometries for a wide range of decoder distributions to which the previous theory was not applicable, opening the door to 'black box' latent geometries. Presented at AISTATS 2022. |
| Database: | OpenAIRE |
| External link: | |
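
The pull-back mentioned in the description is the standard construction from Riemannian geometry. The following is a minimal sketch in assumed notation (a decoder map h sending a latent code z to the decoder's distribution parameters, its Jacobian J_h, and the Fisher information metric I_F on the parameter space); it is not necessarily the exact formulation used in the paper.

```latex
% Sketch (assumed notation): pull-back of the Fisher-Rao metric along
% the decoder h : Z -> H, where h(z) are the decoder's distribution
% parameters and I_F is the Fisher information metric on H.
\[
  M(z) \;=\; J_h(z)^{\top}\, I_F\!\bigl(h(z)\bigr)\, J_h(z),
  \qquad
  J_h(z) \;=\; \frac{\partial h(z)}{\partial z}.
\]
% Curve lengths in the latent space are then measured through M, so that
% they reflect distances between decoder distributions rather than
% distances between raw outputs:
\[
  \operatorname{length}(c) \;=\;
  \int_0^1 \sqrt{\dot{c}(t)^{\top}\, M\bigl(c(t)\bigr)\, \dot{c}(t)}\;\mathrm{d}t .
\]
```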