Showing 1 - 10 of 20 for search: '"Matthews, Alexander G. de G."'
Published in:
Phys. Rev. Research 2, 033429 (2020)
Given access to accurate solutions of the many-electron Schrödinger equation, nearly all chemistry could be derived from first principles. Exact wavefunctions of interesting chemical systems are out of reach because they are NP-hard to compute in general, but approximations can be found using polynomially-scaling algorithms. …
External link:
http://arxiv.org/abs/1909.02487
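This line of work rests on the variational principle: any trial wavefunction yields an energy expectation that upper-bounds the true ground-state energy, which is what lets a flexible neural-network ansatz be optimised towards the exact solution. As a reminder, in standard quantum-mechanics notation (background, not quoted from the paper):

\[
E[\psi] \;=\; \frac{\langle \psi \,|\, \hat{H} \,|\, \psi \rangle}{\langle \psi \,|\, \psi \rangle} \;\ge\; E_0,
\qquad
E[\psi] \;=\; \mathbb{E}_{\mathbf{r} \sim |\psi|^2}\!\left[\frac{(\hat{H}\psi)(\mathbf{r})}{\psi(\mathbf{r})}\right],
\]

where the second form, the expected local energy under samples from |ψ|², is the estimator that variational quantum Monte Carlo methods optimise.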
Author:
Titsias, Michalis K., Schwarz, Jonathan, Matthews, Alexander G. de G., Pascanu, Razvan, Teh, Yee Whye
We introduce a framework for Continual Learning (CL) based on Bayesian inference over the function space rather than the parameters of a deep neural network. This method, referred to as functional regularisation for Continual Learning, avoids forgetting …
External link:
http://arxiv.org/abs/1901.11356
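A minimal sketch of the function-space idea, under my own simplifying assumptions (not the paper's exact method): suppose the previous task left behind a Gaussian posterior over function values at a few stored inputs; training on the new task then adds a KL penalty between the current model's distribution over those function values and the stored posterior.

import numpy as np

def gaussian_kl(mu_q, cov_q, mu_p, cov_p):
    # KL( N(mu_q, cov_q) || N(mu_p, cov_p) ) for full-covariance Gaussians.
    k = mu_q.shape[0]
    cov_p_inv = np.linalg.inv(cov_p)
    diff = mu_p - mu_q
    return 0.5 * (np.trace(cov_p_inv @ cov_q) + diff @ cov_p_inv @ diff
                  - k + np.log(np.linalg.det(cov_p) / np.linalg.det(cov_q)))

# Hypothetical stored summary of task 1: posterior over f at a few inputs Z.
mu_old, cov_old = np.zeros(5), 0.1 * np.eye(5)

# Current model's approximate distribution over f(Z) while training task 2.
mu_new, cov_new = np.full(5, 0.3), 0.2 * np.eye(5)

penalty = gaussian_kl(mu_new, cov_new, mu_old, cov_old)  # added to the task-2 loss
print(penalty)

The point of regularising in function space is that the penalty constrains what the network predicts at the stored inputs, not which weights it uses to do so.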
Dropout, a stochastic regularisation technique for training of neural networks, has recently been reinterpreted as a specific type of approximate inference algorithm for Bayesian neural networks. The main contribution of the reinterpretation is in providing …
External link:
http://arxiv.org/abs/1807.01969
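The reinterpretation referred to here is the one where dropout is kept active at prediction time and several stochastic forward passes are averaged, approximating a Bayesian predictive distribution. A toy NumPy sketch (network size, weights and dropout rate are all illustrative, not from the paper):

import numpy as np

rng = np.random.default_rng(0)

# Toy fixed weights for a two-layer ReLU network (illustrative only).
W1, b1 = rng.normal(size=(10, 3)), np.zeros(10)
W2, b2 = rng.normal(size=(1, 10)), np.zeros(1)

def forward(x, p_drop=0.5):
    # One stochastic forward pass with dropout kept ON at test time.
    h = np.maximum(0.0, W1 @ x + b1)
    mask = rng.random(h.shape) > p_drop      # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)            # inverted-dropout rescaling
    return (W2 @ h + b2)[0]

x = np.array([0.5, -1.0, 2.0])
samples = np.array([forward(x) for _ in range(1000)])

# The sample mean approximates the predictive mean; the spread reflects
# the model uncertainty that the Bayesian reading attaches to dropout.
print(samples.mean(), samples.std())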
Author:
Matthews, Alexander G. de G., Rowland, Mark, Hron, Jiri, Turner, Richard E., Ghahramani, Zoubin
Whilst deep neural networks have shown great empirical success, there is still much work to be done to understand their theoretical properties. In this paper, we study the relationship between random, wide, fully connected, feedforward networks with more than one hidden layer and Gaussian processes with a recursive kernel definition. …
External link:
http://arxiv.org/abs/1804.11271
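The Gaussian process arising in this limit has a covariance defined recursively through the layers. A minimal NumPy sketch of that recursion for ReLU nonlinearities, using the closed-form arc-cosine expectation (the depth and the variances sigma_w2, sigma_b2 are illustrative choices, not the paper's setup):

import numpy as np

def nngp_kernel(x1, x2, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    # Recursive NNGP covariance k(x1, x2) for a deep ReLU network.
    k11 = sigma_b2 + sigma_w2 * (x1 @ x1) / x1.size
    k22 = sigma_b2 + sigma_w2 * (x2 @ x2) / x2.size
    k12 = sigma_b2 + sigma_w2 * (x1 @ x2) / x1.size
    for _ in range(depth):
        cos_t = np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0)
        theta = np.arccos(cos_t)
        # E[relu(u) relu(v)] for (u, v) with the current layer's covariance.
        k12 = sigma_b2 + sigma_w2 / (2 * np.pi) * np.sqrt(k11 * k22) * (
            np.sin(theta) + (np.pi - theta) * cos_t)
        # E[relu(u)^2] = k11 / 2 for zero-mean Gaussian u.
        k11 = sigma_b2 + sigma_w2 * k11 / 2.0
        k22 = sigma_b2 + sigma_w2 * k22 / 2.0
    return k12

print(nngp_kernel(np.array([1.0, 0.0]), np.array([0.6, 0.8])))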
Gaussian multiplicative noise is commonly used as a stochastic regularisation technique in training of deterministic neural networks. A recent paper reinterpreted the technique as a specific algorithm for approximate inference in Bayesian neural networks. …
External link:
http://arxiv.org/abs/1711.02989
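For concreteness, Gaussian multiplicative noise ("Gaussian dropout") scales each activation by unit-mean Gaussian noise instead of a Bernoulli mask. A one-function sketch (the noise variance alpha is an illustrative choice):

import numpy as np

rng = np.random.default_rng(1)

def gaussian_dropout(h, alpha=0.25):
    # Multiply activations by unit-mean noise xi ~ N(1, alpha).
    xi = rng.normal(loc=1.0, scale=np.sqrt(alpha), size=h.shape)
    return h * xi

h = np.array([0.5, 2.0, -1.0])
print(gaussian_dropout(h))  # noisy activations; their expectation equals h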
Deep neural networks (DNNs) have excellent representative power and are state-of-the-art classifiers on many tasks. However, they often do not capture their own uncertainties well, making them less robust in the real world as they overconfidently extrapolate …
External link:
http://arxiv.org/abs/1707.02476
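The overconfident extrapolation is easy to reproduce: a softmax classifier's confidence typically saturates towards 1 as inputs move away from the training data instead of reverting to uncertainty. A toy demonstration with a made-up linear classifier:

import numpy as np

def softmax(z):
    z = z - z.max()          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

W = np.array([[1.0, -0.5],
              [-1.0, 0.5]])  # toy two-class linear classifier

for scale in (1, 10, 100):
    x = scale * np.array([1.0, 1.0])  # move ever further from the data
    p = softmax(W @ x)
    print(scale, p.max())             # confidence tends to 1 as we extrapolate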
Author:
Matthews, Alexander G. de G., van der Wilk, Mark, Nickson, Tom, Fujii, Keisuke, Boukouvalas, Alexis, León-Villagrá, Pablo, Ghahramani, Zoubin, Hensman, James
GPflow is a Gaussian process library that uses TensorFlow for its core computations and Python for its front end. The distinguishing features of GPflow are that it uses variational inference as the primary approximation method, provides concise code through the use of automatic differentiation …
External link:
http://arxiv.org/abs/1610.08733
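A minimal usage sketch; note the paper describes an early version of the library, while this example assumes the current GPflow 2.x API (exact GP regression on synthetic data):

import numpy as np
import gpflow

# Synthetic one-dimensional regression data.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
Y = np.sin(6.0 * X) + 0.1 * np.random.default_rng(0).normal(size=(20, 1))

# GP regression with a squared-exponential kernel.
model = gpflow.models.GPR(data=(X, Y), kernel=gpflow.kernels.SquaredExponential())

# Maximise the marginal likelihood with the built-in SciPy optimiser.
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

# Predictive mean and variance at a new input.
mean, var = model.predict_f(np.array([[0.5]]))
print(mean.numpy(), var.numpy())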
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been made into attacking three issues with GP models: how to compute efficiently when the number of data is large; how to approximate the posterior when the likelihood is not Gaussian …
External link:
http://arxiv.org/abs/1506.04000
The variational framework for learning inducing variables (Titsias, 2009a) has had a large impact on the Gaussian process literature. The framework may be interpreted as minimizing a rigorously defined Kullback-Leibler divergence between the approximating and posterior processes. …
External link:
http://arxiv.org/abs/1504.07027
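For orientation, the collapsed bound from that framework in the conjugate regression case, written in the sparse-GP literature's usual notation with inducing inputs Z, K_{nm} = k(X, Z) and Q_{nn} = K_{nm} K_{mm}^{-1} K_{mn} (standard background, not quoted from this paper):

\[
\log p(\mathbf{y}) \;\ge\; \log \mathcal{N}\!\left(\mathbf{y} \,\middle|\, \mathbf{0},\; Q_{nn} + \sigma^2 I\right) \;-\; \frac{1}{2\sigma^2}\operatorname{tr}\!\left(K_{nn} - Q_{nn}\right).
\]

The contribution of this paper is to make precise in what sense maximising such bounds minimises a Kullback-Leibler divergence between the approximating process and the posterior process, rather than a divergence between finite-dimensional distributions only.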
McCullagh and Yang (2006) suggest a family of classification algorithms based on Cox processes. We further investigate the log Gaussian variant, which has a number of appealing properties. Conditioned on the covariates, the distribution over labels is …
External link:
http://arxiv.org/abs/1405.4141
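The construction can be summarised in one display: each class k gets its own log Gaussian intensity, and conditioned on the covariates the label distribution is the relative intensity, i.e. a softmax of independent GPs (my paraphrase in generic notation, not the paper's):

\[
\lambda_k(x) = \exp\!\big(f_k(x)\big), \quad f_k \sim \mathcal{GP}(m_k, \kappa_k),
\qquad
p(y = k \mid x) \;=\; \frac{\lambda_k(x)}{\sum_j \lambda_j(x)} \;=\; \frac{\exp\!\big(f_k(x)\big)}{\sum_j \exp\!\big(f_j(x)\big)}.
\]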