Adaptive Divergence for Rapid Adversarial Optimization

Authors: Borisyak, Maxim; Gaintseva, Tatiana; Ustyuzhanin, Andrey
Publication year: 2019
Source: PeerJ Computer Science. 2020 May;6:e274
Document type: Working Paper
DOI: 10.7717/peerj-cs.274
Description: Adversarial Optimization (AO) provides a reliable, practical way to match two implicitly defined distributions, one of which is usually represented by a sample of real data, while the other is defined by a generator. Typically, AO involves training a high-capacity model at each step of the optimization. In this work, we consider computationally heavy generators, for which training high-capacity models incurs substantial computational costs. To address this problem, we introduce a novel family of divergences that varies the capacity of the underlying model and allows for significant acceleration with respect to the number of samples drawn from the generator. We demonstrate the performance of the proposed divergences on several tasks, including tuning the parameters of a physics simulator, namely the Pythia event generator.
Database: arXiv
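The abstract describes classifier-based adversarial optimization in which the discriminator's capacity adapts to how distinguishable the two samples are. Below is a minimal, hypothetical sketch of that idea only; the toy Gaussian generator, the scikit-learn models, the 0.55 accuracy threshold, and the accuracy-based divergence proxy are all illustrative assumptions and not the authors' implementation.

# Minimal sketch of adversarial optimization with an adaptive-capacity
# discriminator. All names, models, and thresholds below are illustrative
# assumptions, not the method from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def generator(theta, n):
    # Toy "simulator": a 1-D Gaussian whose mean we want to tune.
    return rng.normal(theta, 1.0, size=(n, 1))

real = rng.normal(0.0, 1.0, size=(1024, 1))  # stand-in for real data

def divergence_estimate(theta, n=1024):
    # Classifier-based divergence proxy: how well can a model separate
    # generated samples from real ones? Start with a low-capacity model
    # and escalate to a high-capacity one only when the cheap model can
    # no longer tell the two samples apart.
    fake = generator(theta, n)
    X = np.vstack([real[:n], fake])
    y = np.hstack([np.zeros(n), np.ones(n)])
    for model in (LogisticRegression(),
                  GradientBoostingClassifier(n_estimators=50)):
        acc = cross_val_score(model, X, y, cv=3).mean()
        if acc > 0.55:          # assumed threshold: cheap model suffices
            return acc - 0.5    # crude divergence proxy in [0, 0.5]
    return acc - 0.5

# Coarse parameter scan standing in for the outer optimizer.
thetas = np.linspace(-2.0, 2.0, 9)
best = min(thetas, key=divergence_estimate)
print("best theta ~", best)

The escalation loop is where the claimed acceleration would come from: far from the optimum a cheap low-capacity model already separates the samples, so the expensive high-capacity model is trained only once the generated and real distributions become hard to distinguish.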