Showing 1 - 10 of 126 for search query: '"FAN, Jiaojiao"'
Author:
Wang, Zhendong, Li, Zhaoshuo, Mandlekar, Ajay, Xu, Zhenjia, Fan, Jiaojiao, Narang, Yashraj, Fan, Linxi, Zhu, Yuke, Balaji, Yogesh, Zhou, Mingyuan, Liu, Ming-Yu, Zeng, Yu
Diffusion models, praised for their success in generative tasks, are increasingly being applied to robotics, demonstrating exceptional performance in behavior cloning. However, their slow generation process stemming from iterative denoising steps … (a brief sketch of such an iterative denoising loop follows the link below).
External link:
http://arxiv.org/abs/2410.21257
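The snippet above is cut off, but the bottleneck it mentions, one network evaluation per denoising step, can be illustrated with a generic reverse-diffusion loop. This is a minimal DDIM-style sketch, not the paper's method; the noise predictor, schedule, and shapes below are placeholder assumptions.

import torch

@torch.no_grad()
def iterative_denoise(model, x_T, alphas_bar):
    """Generic deterministic (DDIM-style) reverse loop.
    model(x, t) is assumed to predict the noise contained in x at step t.
    One forward pass per step is what makes naive sampling slow."""
    x = x_T
    T = len(alphas_bar)
    for t in reversed(range(T)):
        eps = model(x, t)                                    # network call at every step
        a_bar = alphas_bar[t]
        a_prev = alphas_bar[t - 1] if t > 0 else torch.tensor(1.0)
        x0_hat = (x - (1 - a_bar).sqrt() * eps) / a_bar.sqrt()
        x = a_prev.sqrt() * x0_hat + (1 - a_prev).sqrt() * eps
    return x

# Toy usage with a dummy noise predictor and a 50-step schedule (assumed values).
dummy_model = lambda x, t: torch.zeros_like(x)
schedule = torch.linspace(0.99, 0.01, 50)
sample = iterative_denoise(dummy_model, torch.randn(1, 8), schedule)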
The current bottleneck in continuous sign language recognition (CSLR) research lies in the fact that most publicly available datasets are limited to laboratory environments or television program recordings, resulting in a single background environment …
External link:
http://arxiv.org/abs/2409.11960
We study multi-marginal optimal transport (MOT) problems where the underlying cost has a graphical structure. These graphical multi-marginal optimal transport problems have found applications in several domains including traffic flow control and … (a sketch of the graphical-cost MOT objective follows the link below).
External link:
http://arxiv.org/abs/2406.10849
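The snippet is cut off, but the objective it refers to can be written in a generic form: a standard MOT problem whose cost decomposes along the edges of a graph. This is the usual textbook formulation, not necessarily the exact setting of the linked paper; the symbols $G$, $c_{ij}$, and $\Pi$ are generic.

\min_{\pi \in \Pi(\mu_1,\dots,\mu_J)} \int c(x_1,\dots,x_J)\,\mathrm{d}\pi(x_1,\dots,x_J),
\qquad
c(x_1,\dots,x_J) = \sum_{(i,j)\in E} c_{ij}(x_i,x_j),

where $\Pi(\mu_1,\dots,\mu_J)$ is the set of joint distributions with the prescribed marginals $\mu_1,\dots,\mu_J$ and $G=(V,E)$ is the graph encoding which pairs of variables interact in the cost.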
There is a rapidly growing interest in controlling consistency across multiple generated images using diffusion models. Among various methods, recent works have found that simply manipulating attention modules by concatenating features from multiple … (a minimal sketch of this joint-attention idea follows the link below).
External link:
http://arxiv.org/abs/2405.17661
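The snippet above is cut off, but the core idea it references, concatenating key/value features from several images inside self-attention so that generations attend to each other, can be sketched as follows. This is a generic illustration, not the paper's implementation; the tensor layout and sizes are assumptions.

import torch

def joint_self_attention(q, k, v):
    """Self-attention in which each image's queries attend to the keys and
    values of all images at once (concatenated along the token axis).
    q, k, v: tensors of shape (num_images, num_tokens, dim) -- assumed layout."""
    n, t, d = k.shape
    k_joint = k.reshape(1, n * t, d).expand(n, -1, -1)   # every image sees all keys
    v_joint = v.reshape(1, n * t, d).expand(n, -1, -1)   # ... and all values
    attn = torch.softmax(q @ k_joint.transpose(1, 2) / d ** 0.5, dim=-1)
    return attn @ v_joint                                # (num_images, num_tokens, dim)

# Toy usage: 4 images, 16 tokens each, 64-dimensional features (all assumed).
q = torch.randn(4, 16, 64)
k = torch.randn(4, 16, 64)
v = torch.randn(4, 16, 64)
out = joint_self_attention(q, k, v)   # shape (4, 16, 64)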
We consider the sampling problem from a composite distribution whose potential (negative log density) is $\sum_{i=1}^n f_i(x_i)+\sum_{j=1}^m g_j(y_j)+\sum_{i=1}^n\sum_{j=1}^m\frac{\sigma_{ij}}{2\eta} \Vert x_i-y_j \Vert^2_2$ where each of $x_i$ and … (a small numerical sketch of this potential follows the link below).
External link:
http://arxiv.org/abs/2306.13801
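For concreteness, the composite potential quoted above can be evaluated as below. The choices of $f_i$, $g_j$, the coupling weights $\sigma_{ij}$, and $\eta$ here are placeholders, not the paper's; this only illustrates the structure of the objective, not the sampling algorithm itself.

import numpy as np

def composite_potential(x, y, sigma, eta, f, g):
    """Potential sum_i f_i(x_i) + sum_j g_j(y_j)
       + sum_{i,j} sigma_{ij}/(2*eta) * ||x_i - y_j||^2.
    x: (n, d) array of x_i blocks; y: (m, d) array of y_j blocks.
    sigma: (n, m) nonnegative weights; eta: positive scalar.
    f, g: lists of callables, one per block (placeholder choices)."""
    value = sum(f[i](x[i]) for i in range(len(x)))
    value += sum(g[j](y[j]) for j in range(len(y)))
    # Pairwise quadratic coupling between every x_i and y_j.
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)   # (n, m)
    value += (sigma / (2.0 * eta) * sq_dists).sum()
    return value

# Toy example: n=2, m=3 blocks in R^5, quadratic f_i and g_j (assumed).
rng = np.random.default_rng(0)
n, m, d = 2, 3, 5
x, y = rng.normal(size=(n, d)), rng.normal(size=(m, d))
sigma = np.ones((n, m))
f = [lambda z: 0.5 * np.dot(z, z)] * n
g = [lambda z: 0.5 * np.dot(z, z)] * m
print(composite_potential(x, y, sigma, 1.0, f, g))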
Author:
Fan, Jiaojiao, Alvarez-Melis, David
Published in:
Conference on Uncertainty in Artificial Intelligence (UAI) 2023
Data for pretraining machine learning models often consists of collections of heterogeneous datasets. Although training on their union is reasonable in agnostic settings, it might be suboptimal when the target domain -- where the model will ultimately …
External link:
http://arxiv.org/abs/2306.06866
Published in:
COLT 2023
We propose a sampling algorithm that achieves superior complexity bounds in all the classical settings (strongly log-concave, log-concave, Logarithmic-Sobolev inequality (LSI), Poincar\'e inequality) as well as more general settings with semi-smooth …
External link:
http://arxiv.org/abs/2302.10081
Published in:
IEEE CDC 2023
We study the problem of sampling from a target distribution in $\mathbb{R}^d$ whose potential is not smooth. Compared with the sampling problem with smooth potentials, this problem is much less well-understood due to the lack of smoothness. In this paper …
External link:
http://arxiv.org/abs/2208.07459
Author:
Xu, Luchun, Yang, Yongdong, Jiang, Guozheng, Gao, Yushan, Song, Jiawei, Ma, Yukun, Fan, Jiaojiao, Wang, Guanlong, Yu, Xing, Tang, Xiangsheng
Published in:
Journal of Traditional Chinese Medical Sciences, October 2024, 11(4):456-465
Wasserstein gradient flow has emerged as a promising approach to solve optimization problems over the space of probability distributions. A recent trend is to use the well-known JKO scheme in combination with input convex neural networks to numerically … (the generic form of such a JKO step is sketched after the link below).
External link:
http://arxiv.org/abs/2112.02424
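The snippet is cut off, but the scheme it refers to can be summarized generically. One JKO step applied to a functional $F$ with step size $\tau$ reads

\rho_{k+1} = \arg\min_{\rho}\; F(\rho) + \frac{1}{2\tau} W_2^2(\rho,\rho_k),

and the numerical trend the snippet mentions is to parametrize the update as a pushforward by the gradient of an input convex neural network $\varphi_\theta$, replacing the Wasserstein term by a transport cost under $\rho_k$:

\theta_{k+1} = \arg\min_{\theta}\; F\big((\nabla\varphi_\theta)_{\#}\rho_k\big) + \frac{1}{2\tau}\,\mathbb{E}_{x\sim\rho_k}\big[\Vert \nabla\varphi_\theta(x) - x \Vert_2^2\big].

This is the generic construction used in this line of work; the linked paper's exact objective, regularization, and training procedure may differ.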