Showing 1 - 10 of 1 311 for the search: '"MacReady A"'
The past decade has amply demonstrated the remarkable functionality that can be realized by learning complex input/output relationships. Algorithmically, one of the most important and opaque relationships is that between a problem's structure and an …
External link:
http://arxiv.org/abs/2207.14422
Author:
MACREADY, HANNAH
Published in:
Spotlight - Einfach Besser Englisch. 2024, Issue 7, p38-45. 8p.
Author:
MACREADY, HANNAH
Published in:
Spotlight - Einfach Besser Englisch. 2024, Issue 3, p22-25. 4p.
Domain shift is unavoidable in real-world applications of object detection. For example, in self-driving cars, the target domain consists of unconstrained road environments which cannot all possibly be observed in training data. Similarly, in surveil…
External link:
http://arxiv.org/abs/1904.02361
The representation of the approximate posterior is a critical aspect of effective variational autoencoders (VAEs). Poor choices for the approximate posterior have a detrimental impact on the generative performance of VAEs due to the mismatch with the …
External link:
http://arxiv.org/abs/1901.03440
Building a large image dataset with high-quality object masks for semantic segmentation is costly and time consuming. In this paper, we introduce a principled semi-supervised framework that only uses a small set of fully supervised images (having sem…
External link:
http://arxiv.org/abs/1811.07073
Author:
Bian, Zhengbing, Chudak, Fabian, Macready, William, Roy, Aidan, Sebastiani, Roberto, Varotti, Stefano
Quantum annealers (QAs) are specialized quantum computers that minimize objective functions over discrete variables by physically exploiting quantum effects. Current QA platforms allow for the optimization of quadratic objectives defined over binary …
External link:
http://arxiv.org/abs/1811.02524
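To make the abstract's notion concrete: a quadratic objective over binary variables (a QUBO) is the kind of problem a quantum annealer minimizes. A minimal brute-force sketch, assuming a tiny illustrative coefficient matrix `Q` (all names and values here are illustrative, not taken from the paper):

```python
# Brute-force minimization of a small QUBO: E(x) = sum_ij Q[i,j] * x_i * x_j
# over binary x. A quantum annealer solves the same form physically; this
# exhaustive search is only feasible for a handful of variables.
import itertools

# Coefficients: diagonal entries are linear terms, off-diagonal are couplings.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def qubo_energy(x, Q):
    """Energy of binary assignment x under QUBO coefficients Q."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

n = 2  # number of binary variables
best = min(itertools.product([0, 1], repeat=n), key=lambda x: qubo_energy(x, Q))
```

Here the coupling `Q[0,1] = 2.0` penalizes setting both variables, so the minimum energy is achieved by turning on exactly one of them.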
In many applications we seek to maximize an expectation with respect to a distribution over discrete variables. Estimating gradients of such objectives with respect to the distribution parameters is a challenging problem. We analyze existing solution…
External link:
http://arxiv.org/abs/1810.00116
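One standard solution the abstract alludes to is the score-function (REINFORCE) estimator. A hedged sketch for a single Bernoulli variable, where the toy objective `f` and the parameter value are illustrative assumptions, not details from the paper:

```python
# Score-function (REINFORCE) estimate of d/dtheta E_{x~Bern(theta)}[f(x)],
# using the identity grad E[f(x)] = E[f(x) * grad log p(x)].
import random

def f(x):
    # Toy objective over one binary variable (illustrative only).
    return 3.0 if x == 1 else 1.0

def reinforce_grad(theta, n_samples=100_000):
    """Monte Carlo estimate of the gradient w.r.t. theta."""
    total = 0.0
    for _ in range(n_samples):
        x = 1 if random.random() < theta else 0
        # Score: d/dtheta log p(x) = x/theta - (1-x)/(1-theta) for Bernoulli.
        score = x / theta - (1 - x) / (1 - theta)
        total += f(x) * score
    return total / n_samples

random.seed(0)
est = reinforce_grad(0.4)
# Closed form for comparison: E[f] = 3*theta + 1*(1-theta), so d/dtheta = 2.0
```

The estimator is unbiased but high-variance, which is exactly why work like this analyzes and improves on it.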
Boltzmann machines are powerful distributions that have been shown to be an effective prior over binary latent variables in variational autoencoders (VAEs). However, previous methods for training discrete VAEs have used the evidence lower bound and n…
External link:
http://arxiv.org/abs/1805.07445
Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult. We propose a new class of smoothing transformations based on a mixture of two overlapping distributions, and sho…
External link:
http://arxiv.org/abs/1802.04920