Showing 1 - 10 of 41 for search: '"Gu, Yihong"'
Pursuing causality from data is a fundamental problem in scientific discovery, treatment intervention, and transfer learning. This paper introduces a novel algorithmic method for addressing nonparametric invariance and causality learning in regression …
External link:
http://arxiv.org/abs/2405.04715
It has been observed empirically that large language models (LLMs), trained with a variant of regression loss on massive corpora from the Internet, can unveil causal associations to some extent. This is contrary to the traditional wisdom that "association …
External link:
http://arxiv.org/abs/2403.01420
This paper considers a multiple-environments linear regression model in which data from multiple experimental settings are collected. The joint distribution of the response variable and covariates may vary across different environments, yet the conditional …
External link:
http://arxiv.org/abs/2303.03092
Author:
Fan, Jianqing, Gu, Yihong
This paper introduces a Factor Augmented Sparse Throughput (FAST) model that utilizes both latent factors and sparse idiosyncratic components for nonparametric regression. The FAST model bridges factor models on one end and sparse nonparametric models …
External link:
http://arxiv.org/abs/2210.02002
This paper investigates the stability of deep ReLU neural networks for nonparametric regression under the assumption that the noise has only a finite p-th moment. We unveil how the optimal rate of convergence depends on p, the degree of smoothness, and …
External link:
http://arxiv.org/abs/2203.10418
Analysis of over-parameterized neural networks has drawn significant attention in recent years. It was shown that such systems behave like convex systems under various restricted settings, such as for two-level neural networks, and when learning is only …
External link:
http://arxiv.org/abs/1911.07626
We study the question of how to imitate tasks across domains with discrepancies such as embodiment, viewpoint, and dynamics mismatch. Many prior works require paired, aligned demonstrations and an additional RL step that requires environment interaction …
External link:
http://arxiv.org/abs/1910.00105
Academic article
Author:
Gu, Yihong, Yan, Jun, Zhu, Hao, Liu, Zhiyuan, Xie, Ruobing, Sun, Maosong, Lin, Fen, Lin, Leyu
Most language modeling methods rely on large-scale data to statistically learn the sequential patterns of words. In this paper, we argue that words are atomic language units but not necessarily atomic semantic units. Inspired by HowNet, we use sememes …
External link:
http://arxiv.org/abs/1810.12387
In this paper we introduce ZhuSuan, a Python probabilistic programming library for Bayesian deep learning, which conjoins the complementary advantages of Bayesian methods and deep learning. ZhuSuan is built upon TensorFlow. Unlike existing deep learning …
External link:
http://arxiv.org/abs/1709.05870