Showing 1 - 10 of 72 for search: '"Vasudevan, Srinivas"'
Author:
Song, Xingyou, Zhang, Qiuyi, Lee, Chansoo, Fertig, Emily, Huang, Tzu-Kuo, Belenki, Lior, Kochanski, Greg, Ariafar, Setareh, Vasudevan, Srinivas, Perel, Sagi, Golovin, Daniel
Google Vizier has performed millions of optimizations and accelerated numerous research and production systems at Google, demonstrating the success of Bayesian optimization as a large-scale service. Over multiple years, its algorithm has been improved …
External link:
http://arxiv.org/abs/2408.11527
We show how rational function approximations to the logarithm, such as $\log z \approx (z^2 - 1)/(z^2 + 6z + 1)$, can be turned into fast algorithms for approximating the determinant of a very large matrix. We empirically demonstrate that when combined … (see the sketch below)
External link:
http://arxiv.org/abs/2405.03474
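The following is a rough NumPy sketch of my own illustrating the mechanism this entry describes, not the paper's algorithm: with a rational approximation $r(z) \approx \log z$, the identity $\log\det A = \operatorname{tr}\log A$ lets one estimate the log-determinant from linear solves only, using Rademacher (Hutchinson) probes for the trace. For the sketch I use the classical [2/2] Padé approximant $\log z \approx 3(z^2-1)/(z^2+4z+1)$, which is my choice here and not necessarily the approximant the paper analyzes; the function name approx_logdet and all parameter values are likewise illustrative.

import numpy as np

def approx_logdet(A, num_probes=64, seed=0):
    """Estimate log(det A) for a symmetric positive-definite A via
    tr(r(A)) with r(z) = 3(z^2 - 1)/(z^2 + 4z + 1) and Hutchinson probes."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    I = np.eye(n)
    numer = 3.0 * (A @ A - I)        # 3 (z^2 - 1) with z -> A
    denom = A @ A + 4.0 * A + I      # z^2 + 4 z + 1 with z -> A
    total = 0.0
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)             # Rademacher probe
        total += v @ numer @ np.linalg.solve(denom, v)  # v^T r(A) v
    return total / num_probes

# Small SPD test matrix with eigenvalues near 1, where the approximant is accurate;
# the two printed numbers should be close.
B = 0.05 * np.random.default_rng(1).normal(size=(50, 50))
A = np.eye(50) + B @ B.T
print(approx_logdet(A), np.linalg.slogdet(A)[1])

For a genuinely large sparse matrix one would avoid forming A @ A explicitly and replace np.linalg.solve with an iterative solver; the sketch keeps everything dense for readability.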
Author:
Nijkamp, Erik, Gao, Ruiqi, Sountsov, Pavel, Vasudevan, Srinivas, Pang, Bo, Zhu, Song-Chun, Wu, Ying Nian
Learning an energy-based model (EBM) requires MCMC sampling of the learned model as an inner loop of the learning algorithm. However, MCMC sampling of EBMs in high-dimensional data space generally does not mix, because the energy function, which is usually … (see the sketch below)
External link:
http://arxiv.org/abs/2006.06897
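For orientation, here is a minimal NumPy sketch, my own toy illustration and not the method proposed in the paper, of the generic inner loop the abstract refers to: unadjusted Langevin dynamics drawing samples from an energy-based model $p(x) \propto \exp(-E(x))$. The helper name langevin_sample and all settings are assumptions for the sketch.

import numpy as np

def langevin_sample(energy_grad, x0, step_size=1e-2, num_steps=1000, seed=0):
    """Unadjusted Langevin dynamics: x <- x - eps * grad E(x) + sqrt(2 eps) * noise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(num_steps):
        noise = rng.normal(size=x.shape)
        x = x - step_size * energy_grad(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Toy energy: standard Gaussian, E(x) = ||x||^2 / 2, so grad E(x) = x.
samples = np.stack([langevin_sample(lambda x: x, np.zeros(2), seed=s)
                    for s in range(500)])
print(samples.mean(axis=0), samples.std(axis=0))  # roughly 0 and roughly 1

In a real EBM, energy_grad would be the gradient of a deep network with respect to its input, and in high dimension such chains tend not to mix, which is exactly the problem the abstract raises.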
Constant-memory algorithms, also loosely called Markov chains, power the vast majority of probabilistic inference and machine learning applications today. A lot of progress has been made in constructing user-friendly APIs around these algorithms. Such … (see the sketch below)
External link:
http://arxiv.org/abs/2001.05035
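To make the abstract's framing concrete, here is a toy sketch, entirely my own and not the paper's API, of a Markov chain written as a pure step function state -> (state, extras): the driver only ever holds the current state, so memory use is constant in the chain length.

import numpy as np

def rwm_step(state, log_prob_fn, scale, rng):
    """One random-walk Metropolis step; returns (new_state, accepted)."""
    proposal = state + scale * rng.normal(size=np.shape(state))
    log_accept_ratio = log_prob_fn(proposal) - log_prob_fn(state)
    accepted = np.log(rng.uniform()) < log_accept_ratio
    return (proposal if accepted else state), accepted

def run_chain(step_fn, state, num_steps, rng):
    """Drive any step function while carrying only constant-size state."""
    num_accepted = 0
    for _ in range(num_steps):
        state, accepted = step_fn(state, rng)
        num_accepted += accepted
    return state, num_accepted / num_steps

rng = np.random.default_rng(0)
log_prob = lambda x: -0.5 * np.sum(x ** 2)               # standard Gaussian target
step = lambda s, r: rwm_step(s, log_prob, scale=0.5, rng=r)
final_state, accept_rate = run_chain(step, np.zeros(3), 2000, rng)
print(final_state, accept_rate)

The design point being illustrated is that any algorithm expressed as such a step function can be composed with or swapped for another without the driver loop changing.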
Author:
Hoffman, Matthew, Sountsov, Pavel, Dillon, Joshua V., Langmore, Ian, Tran, Dustin, Vasudevan, Srinivas
Hamiltonian Monte Carlo is a powerful algorithm for sampling from difficult-to-normalize posterior distributions. However, when the geometry of the posterior is unfavorable, it may take many expensive evaluations of the target distribution and its gradient … (see the sketch below)
External link:
http://arxiv.org/abs/1903.03704
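As a hedged sketch of the kind of remedy this entry is about, the snippet below runs Hamiltonian Monte Carlo on a badly scaled Gaussian through a transformed kernel, so the sampler explores a better-conditioned space. The paper learns the transformation with a neural network (a flow); this sketch hard-codes a diagonal rescaling instead, and it assumes a recent tensorflow-probability where tfp.mcmc.HamiltonianMonteCarlo, tfp.mcmc.TransformedTransitionKernel, and tfp.bijectors.Scale are available.

import tensorflow as tf
import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

# Badly scaled target: one coordinate is 100x wider than the other.
target = tfd.MultivariateNormalDiag(loc=[0., 0.], scale_diag=[100., 1.])

hmc = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target.log_prob,
    step_size=0.5,
    num_leapfrog_steps=8)

# HMC runs on the pullback of the target through the bijector, i.e. in a space
# where both coordinates have comparable scale; returned states are in x-space.
kernel = tfp.mcmc.TransformedTransitionKernel(
    inner_kernel=hmc,
    bijector=tfb.Scale(scale=[100., 1.]))

samples = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.zeros([2]),
    kernel=kernel,
    trace_fn=None)

print(tf.math.reduce_std(samples, axis=0))  # should come out near [100., 1.]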
Author:
Tran, Dustin, Hoffman, Matthew, Moore, Dave, Suter, Christopher, Vasudevan, Srinivas, Radul, Alexey, Johnson, Matthew, Saurous, Rif A.
We describe a simple, low-level approach for embedding probabilistic programming in a deep learning ecosystem. In particular, we distill probabilistic programming down to a single abstraction: the random variable. Our lightweight implementation in TensorFlow … (see the sketch below)
External link:
http://arxiv.org/abs/1811.02091
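To illustrate the single abstraction the abstract mentions, here is a toy RandomVariable of my own, not the paper's TensorFlow implementation: a distribution paired with a value sampled at construction time, so that model code reads like ordinary numerical code while log-densities stay available.

import numpy as np

class RandomVariable:
    """A sampled value bundled with the log-density of its distribution."""

    def __init__(self, sample_fn, log_prob_fn):
        self.value = sample_fn()          # sampling happens at construction
        self._log_prob_fn = log_prob_fn

    def log_prob(self, x):
        return self._log_prob_fn(x)

def Normal(loc, scale, rng=np.random.default_rng(0)):
    return RandomVariable(
        sample_fn=lambda: rng.normal(loc, scale),
        log_prob_fn=lambda x: (-0.5 * ((x - loc) / scale) ** 2
                               - np.log(scale * np.sqrt(2.0 * np.pi))))

# A two-line generative model: z ~ Normal(0, 1); x ~ Normal(z, 0.1).
z = Normal(0.0, 1.0)
x = Normal(z.value, 0.1)
print(z.value, x.value, x.log_prob(x.value))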
Author:
Dillon, Joshua V., Langmore, Ian, Tran, Dustin, Brevdo, Eugene, Vasudevan, Srinivas, Moore, Dave, Patton, Brian, Alemi, Alex, Hoffman, Matt, Saurous, Rif A.
The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic … (see the sketch below)
External link:
http://arxiv.org/abs/1711.10604
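A hedged usage sketch of the two abstractions the abstract refers to, distributions and bijectors, as they are exposed today in TensorFlow Probability (into which, to my understanding, this library was folded); the exact names assume a recent tensorflow-probability release.

import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

base = tfd.Normal(loc=0., scale=1.)
print(base.sample(3), base.log_prob(0.))       # sampling and log-density

# A log-normal obtained by pushing the base distribution through a bijector.
log_normal = tfd.TransformedDistribution(distribution=base, bijector=tfb.Exp())
print(log_normal.sample(3), log_normal.log_prob(1.))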
We introduce a new obstruction to lifting smooth proper varieties from characteristic $p > 0$ to characteristic $0$. It is based on Grothendieck's specialization homomorphism and the resulting discrete finiteness properties of étale fundamental groups. (See the note below.)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::d4c9c78a312fc5232454fd625a9f2666
https://doi.org/10.14231/ag-2023-011
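Note. For context, the specialization homomorphism invoked above is, as I recall the standard statement (not quoted from the paper): for $X$ smooth and proper over a complete discrete valuation ring with algebraically closed residue field $\bar{k}$ of characteristic $p$ and fraction field $K$ of characteristic $0$, there is a surjection
$$ sp \colon \pi_1^{\mathrm{ét}}(X_{\bar{K}}) \twoheadrightarrow \pi_1^{\mathrm{ét}}(X_{\bar{k}}), $$
and the left-hand group is topologically finitely generated, being the profinite completion of the fundamental group of a compact complex manifold. Hence a smooth proper variety over $\bar{k}$ that lifts to characteristic $0$ must have a topologically finitely generated étale fundamental group, presumably the sort of discrete finiteness property the abstract alludes to.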
Let $p$ be a prime number, and let $k$ be an algebraically closed field of characteristic $p$. We show that the tame fundamental group of a smooth affine curve over $k$ is a projective profinite group. We prove that the fundamental group of a smooth … (see the note below)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::6a47e931661d20b753cb95c33cd84486
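Note. For context on the terminology (a standard characterization, not taken from the paper): a profinite group $G$ is projective exactly when its cohomological dimension satisfies $\operatorname{cd}(G) \le 1$, equivalently when every finite embedding problem for $G$ has a weak solution, i.e. for every surjection of finite groups $B \twoheadrightarrow A$ and every continuous homomorphism $G \to A$ there is a continuous homomorphism $G \to B$ lifting it.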
Author:
Hélène Esnault, Vasudevan Srinivas
Published in:
International Mathematics Research Notices. 2019:5635-5648
We prove that the vanishing of the functoriality morphism for the étale fundamental group between smooth projective varieties over an algebraically closed field of characteristic $p>0$ forces the same property for the fundamental groups of stratifications … (see the note below)
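Note. For context: attached to a morphism $f \colon X \to Y$ of connected pointed varieties, the functoriality morphism in question is the induced map
$$ f_* \colon \pi_1^{\mathrm{ét}}(X, \bar{x}) \longrightarrow \pi_1^{\mathrm{ét}}(Y, f(\bar{x})), $$
and its vanishing means that the image is trivial, equivalently that every finite étale cover of $Y$ pulls back to a completely split cover of $X$. This is the standard dictionary, not a statement taken from the paper.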