Showing 1 - 10 of 40 for search: '"Meng, Hengyu"'
Author:
Shao, Zhijing, Wang, Duotun, Tian, Qing-Yao, Yang, Yao-Dong, Meng, Hengyu, Cai, Zeyu, Dong, Bo, Zhang, Yu, Zhang, Kang, Wang, Zeyu
Although neural rendering has made significant advancements in creating lifelike, animatable full-body and head avatars, incorporating detailed expressions into full-body avatars remains largely unexplored. We present DEGAS, the first 3D Gaussian Splatting …
External link:
http://arxiv.org/abs/2408.10588
Author:
Wang, Duotun, Meng, Hengyu, Cai, Zeyu, Shao, Zhijing, Liu, Qianxi, Wang, Lin, Fan, Mingming, Zhan, Xiaohang, Wang, Zeyu
Current text-to-avatar methods often rely on implicit representations (e.g., NeRF, SDF, and DMTet), leading to 3D content that artists cannot easily edit and animate in graphics software. This paper introduces a novel framework for generating stylized …
External link:
http://arxiv.org/abs/2403.09326
Visual storytelling often uses nontypical aspect-ratio images like scroll paintings, comic strips, and panoramas to create an expressive and compelling narrative. While generative AI has achieved great success and shown the potential to reshape the c…
External link:
http://arxiv.org/abs/2312.10899
Large language models (LLMs) have demonstrated remarkable performance and tremendous potential across a wide range of tasks. However, deploying these models has been challenging due to the astronomical amount of model parameters, which requires a dem…
External link:
http://arxiv.org/abs/2311.00502
Author:
Shen, Haihao, Meng, Hengyu, Dong, Bo, Wang, Zhe, Zafrir, Ofir, Ding, Yi, Luo, Yu, Chang, Hanwen, Gao, Qun, Wang, Ziheng, Boudoukh, Guy, Wasserblat, Moshe
In recent years, Transformer-based language models have become the standard approach for natural language processing tasks. However, stringent throughput and latency requirements in industrial applications are limiting their adoption. To mitigate the …
External link:
http://arxiv.org/abs/2306.16601
Author:
Shen, Haihao, Zafrir, Ofir, Dong, Bo, Meng, Hengyu, Ye, Xinyu, Wang, Zhe, Ding, Yi, Chang, Hanwen, Boudoukh, Guy, Wasserblat, Moshe
Transformer-based language models have become the standard approach to solving natural language processing tasks. However, industry adoption usually requires the maximum throughput to comply with certain latency constraints that prevents Transformer…
External link:
http://arxiv.org/abs/2211.07715
Academic article
Author:
Duan, Yuqing, Guo, Dingjie, Zhang, Xin, Lan, Linwei, Meng, Hengyu, Wang, Yashan, Sui, Chuanying, Qu, Zihan, He, Guangliang, Wang, Chunpeng, Liu, Xin
Published in:
Photodiagnosis and Photodynamic Therapy, September 2023, Vol. 43
Academic article
Author:
Qu, Zihan, Wang, Yashan, Guo, Dingjie, He, Guangliang, Sui, Chuanying, Duan, Yuqing, Zhang, Xin, Meng, Hengyu, Lan, Linwei, Liu, Xin
Published in:
Journal of Gastroenterology & Hepatology, September 2024, Vol. 39, Issue 9, pp. 1816-1826