Towards universal neural nets: Gibbs machines and ACE
Author: | Georgiev, Galin |
---|---|
Year of publication: | 2015 |
Subject: | |
Document type: | Working Paper |
Description: | We study, from a physics viewpoint, a class of generative neural nets, Gibbs machines, designed for gradual learning. While they include variational auto-encoders, they offer a broader universal platform for incrementally adding newly learned features, including physical symmetries. Their direct connection to statistical physics and information geometry is established. A variational Pythagorean theorem justifies invoking the exponential/Gibbs class of probabilities for creating brand new objects. Combining these nets with classifiers gives rise to a new brand of universal generative neural nets: stochastic auto-classifier-encoders (ACE). ACE achieve state-of-the-art performance in their class, both for classification and for density estimation, on the MNIST data set. Comment: v5: added thermodynamic identities and variational error estimation; expanded references |
Database: | arXiv |
External link: |
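
The description invokes the exponential/Gibbs class of probabilities and a variational Pythagorean theorem. As background, the following is a sketch of the standard textbook forms of these objects from information geometry; the notation (base density q, sufficient statistics T, natural parameters theta) is assumed here and the paper's own variational statement may differ.

```latex
% Gibbs/exponential family built over a base density q(x), with
% sufficient statistics T(x) and natural parameters \theta
% (standard textbook form; the notation is an assumption, not the paper's):
p_\theta(x) \;=\; \exp\!\big(\theta \cdot T(x) - A(\theta)\big)\, q(x),
\qquad
A(\theta) \;=\; \log \int \exp\!\big(\theta \cdot T(x)\big)\, q(x)\, dx .

% Generalized Pythagorean theorem for the Kullback--Leibler divergence:
% if q^{*} is the I-projection of q onto the linear family
% \{p : \mathbb{E}_p[T(x)] = t\}, then q^{*} belongs to the exponential
% family above and, for every p in that linear family,
D(p \,\|\, q) \;=\; D(p \,\|\, q^{*}) + D(q^{*} \,\|\, q).
```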
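
The description also notes that combining these generative nets with classifiers yields stochastic auto-classifier-encoders (ACE). The sketch below is only an illustration of that general idea, a variational auto-encoder with a classifier head attached, written in PyTorch; the paper's actual ACE architecture, losses, dimensions, and weights are not specified here, so all of those choices (including placing the classifier on the latent code) are assumptions.

```python
# Illustrative sketch, not the paper's implementation: a small VAE whose
# latent code also feeds a classifier head, trained with a combined
# reconstruction + KL + classification loss. All sizes and loss weights
# are assumptions chosen for a flattened-MNIST-shaped input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AutoClassifierEncoder(nn.Module):
    def __init__(self, x_dim=784, h_dim=400, z_dim=20, n_classes=10):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)        # shared encoder body
        self.mu = nn.Linear(h_dim, z_dim)         # Gaussian mean of the code
        self.logvar = nn.Linear(h_dim, z_dim)     # Gaussian log-variance
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        self.cls = nn.Linear(z_dim, n_classes)    # classifier head on the code

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.dec(z), self.cls(z), mu, logvar

def ace_style_loss(x, y, x_logits, y_logits, mu, logvar, beta=1.0, gamma=1.0):
    # Bernoulli reconstruction + KL to a standard-normal prior + cross-entropy.
    rec = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    ce = F.cross_entropy(y_logits, y, reduction="sum")
    return rec + beta * kl + gamma * ce

# Usage on a random batch shaped like flattened MNIST digits.
model = AutoClassifierEncoder()
x = torch.rand(8, 784)
y = torch.randint(0, 10, (8,))
x_logits, y_logits, mu, logvar = model(x)
print(ace_style_loss(x, y, x_logits, y_logits, mu, logvar))
```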