Showing 1 - 9 of 9
for search: '"Li, Chunyuan"'
Author:
Zheng, Huangjie, Chen, Xu, Yao, Jiangchao, Yang, Hongxia, Li, Chunyuan, Zhang, Ya, Zhang, Hao, Tsang, Ivor, Zhou, Jingren, Zhou, Mingyuan
Contrastive learning (CL) methods effectively learn data representations without label supervision, where the encoder contrasts each positive sample over multiple negative samples via a one-vs-many softmax cross-entropy loss. By leveraging large amounts …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::7b099882724864e15c1d503162f91114
http://arxiv.org/abs/2105.03746
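The one-vs-many softmax cross-entropy loss described in this abstract is commonly written as an InfoNCE-style objective: the positive pair's similarity competes against many negatives under a softmax. A minimal pure-Python sketch (function name, similarity values, and the temperature of 0.1 are illustrative, not taken from the paper):

```python
import math

def info_nce_loss(pos_sim, neg_sims, temperature=0.1):
    """One-vs-many softmax cross-entropy: the positive sample's
    similarity is contrasted against multiple negative similarities."""
    logits = [pos_sim / temperature] + [s / temperature for s in neg_sims]
    # Numerically stable log-sum-exp over all logits
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(l - m) for l in logits))
    # Cross-entropy with the positive pair as the "correct class" (index 0)
    return log_sum - logits[0]

# A well-separated positive yields a small loss; a confusable one does not.
low = info_nce_loss(0.9, [0.1, 0.0, -0.2])
high = info_nce_loss(0.1, [0.8, 0.9, 0.7])
```

Minimizing this loss pulls the positive pair together while pushing the encoder's representation away from every negative at once, which is why large negative pools matter for CL methods.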
Generating long-range skeleton-based human actions has been a challenging problem since small deviations of one frame can cause a malformed action sequence. Most existing methods borrow ideas from video generation, which naively treat skeleton nodes/…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::6b676dce46da93e82b7d0a3834e01930
The instability in GAN training has been a long-standing problem despite remarkable research efforts. We identify that instability issues stem from difficulties of performing feature matching with mini-batch statistics, due to a fragile balance between …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a6043573baaf29f415a786ab930eac55
The Straight-Through (ST) estimator is a widely used technique for back-propagating gradients through discrete random variables. However, this effective method lacks theoretical justification. In this paper, we show that ST can be interpreted as the …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2f72cca565692cde200354c3d10976c5
http://arxiv.org/abs/1910.02176
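The Straight-Through trick this abstract refers to uses a hard discrete sample in the forward pass but back-propagates as if the non-differentiable threshold were the identity. A minimal hand-written forward/backward sketch for a Bernoulli variable (the function name and the use of the sigmoid's derivative as the surrogate gradient are illustrative assumptions, not the paper's exact formulation):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def st_bernoulli_forward_backward(logit, upstream_grad, u):
    """Straight-Through estimator for a Bernoulli sample.

    Forward: draw a hard 0/1 sample by thresholding a uniform draw u
    against the probability sigmoid(logit).
    Backward: the threshold has zero gradient almost everywhere, so ST
    back-propagates through the probability instead, as if the
    hard thresholding step were the identity.
    """
    p = sigmoid(logit)
    sample = 1.0 if u < p else 0.0            # discrete, non-differentiable
    grad_logit = upstream_grad * p * (1 - p)  # surrogate: d(p)/d(logit)
    return sample, grad_logit

sample, grad = st_bernoulli_forward_backward(0.0, 1.0, 0.3)  # p = 0.5
```

The forward value stays exactly discrete, while training still receives a usable (if biased) gradient signal, which is the behavior the paper sets out to justify theoretically.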
Collaborative filtering is widely used in modern recommender systems. Recent research shows that variational autoencoders (VAEs) yield state-of-the-art performance by integrating flexible representations from deep neural networks into latent variable …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::5345876f0dca08b63a29e4ec94e3042d
http://arxiv.org/abs/1906.04281
The posteriors over neural network weights are high dimensional and multimodal. Each mode typically characterizes a meaningfully different representation of the data. We develop Cyclical Stochastic Gradient MCMC (SG-MCMC) to automatically explore such …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::199301a4fb3411702a69e565d5088515
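The core device behind cyclical SG-MCMC is a stepsize schedule that repeatedly decays and restarts: large steps at the start of each cycle let the sampler jump between posterior modes, small steps at the end let it sample locally. A sketch of a half-cosine cyclical schedule of this kind (parameter names and the specific values below are illustrative):

```python
import math

def cyclical_stepsize(k, total_iters, n_cycles, alpha0):
    """Cyclical stepsize schedule: within each cycle the stepsize decays
    from alpha0 toward 0 along a half-cosine, then restarts at alpha0.
    Large steps explore new modes; small steps sample within a mode."""
    iters_per_cycle = math.ceil(total_iters / n_cycles)
    t = (k % iters_per_cycle) / iters_per_cycle  # position in cycle, in [0, 1)
    return alpha0 / 2 * (math.cos(math.pi * t) + 1)
```

Each cycle restart gives the chain a fresh burst of exploration, which is how the sampler can visit several meaningfully different weight configurations in one run instead of staying near a single mode.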
Generative Adversarial Networks (GANs) have proven to be a powerful framework for learning to draw samples from complex distributions. However, GANs are also notoriously difficult to train, with mode collapse and oscillations a common problem. We hypothesize …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::76b64689432d795954429f2bfdd76ee8
http://arxiv.org/abs/1811.11083
Many recently trained neural networks employ large numbers of parameters to achieve good performance. One may intuitively use the number of parameters required as a rough gauge of the difficulty of a problem. But how accurate are such notions? How many …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::aee3f8f4d23dd7bde13b7568f1be4ed1
http://arxiv.org/abs/1804.08838
A new form of the variational autoencoder (VAE) is proposed, based on the symmetric Kullback-Leibler divergence. It is demonstrated that learning of the resulting symmetric VAE (sVAE) has close connections to previously developed adversarial-learning …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::17694ea7311501af32e0208ea3eb5b43