Showing 1 - 10 of 998 for search: '"LIU ChaoQun"'
Published in:
Xiehe Yixue Zazhi, Vol 12, Iss 6, Pp 999-1003 (2021)
Pediatric ambulatory surgery can minimize the separation of pediatric patients from their parents, relieve anxiety, and offer care at a reduced cost. However, children are prone to experiencing pain, nausea, vomiting, delirium, and other complication
External link:
https://doaj.org/article/8abdcb18afb04827b8ec709db63db5da
Traffic prediction is an indispensable component of urban planning and traffic management. Achieving accurate traffic prediction hinges on the ability to capture the potential spatio-temporal relationships among road sensors. However, the majority of
External link:
http://arxiv.org/abs/2412.14569
Author:
Chia, Yew Ken, Cheng, Liying, Chan, Hou Pong, Liu, Chaoqun, Song, Maojia, Aljunied, Sharifah Mahani, Poria, Soujanya, Bing, Lidong
The ability to understand and answer questions over documents can be useful in many business and practical applications. However, documents often contain lengthy and diverse multimodal contents such as texts, figures, and tables, which are very time-
External link:
http://arxiv.org/abs/2411.06176
Author:
Liu, Chaoqun, Chao, Qin, Zhang, Wenxuan, Wu, Xiaobao, Li, Boyang, Luu, Anh Tuan, Bing, Lidong
Large Language Models (LLMs) have demonstrated remarkable performance through supervised fine-tuning or in-context learning using gold labels. However, this paradigm is limited by the availability of gold labels, while in certain scenarios, LLMs may
External link:
http://arxiv.org/abs/2409.12425
SeaLLMs 3: Open Foundation and Chat Multilingual Large Language Models for Southeast Asian Languages
Author:
Zhang, Wenxuan, Chan, Hou Pong, Zhao, Yiran, Aljunied, Mahani, Wang, Jianyu, Liu, Chaoqun, Deng, Yue, Hu, Zhiqiang, Xu, Weiwen, Chia, Yew Ken, Li, Xin, Bing, Lidong
Large Language Models (LLMs) have shown remarkable abilities across various tasks, yet their development has predominantly centered on high-resource languages like English and Chinese, leaving low-resource languages underserved. To address this dispa
External link:
http://arxiv.org/abs/2407.19672
Large language models (LLMs) have demonstrated multilingual capabilities; yet, they are mostly English-centric due to the imbalanced training corpora. Existing works leverage this phenomenon to improve their multilingual performances through translat
External link:
http://arxiv.org/abs/2403.10258
Author:
Wu, Xiaobao, Pan, Fengjun, Nguyen, Thong, Feng, Yichao, Liu, Chaoqun, Nguyen, Cong-Duy, Luu, Anh Tuan
Hierarchical topic modeling aims to discover latent topics from a corpus and organize them into a hierarchy to understand documents with desirable semantic granularity. However, existing work struggles with producing topic hierarchies of low affinity
External link:
http://arxiv.org/abs/2401.14113
Author:
Nguyen, Xuan-Phi, Zhang, Wenxuan, Li, Xin, Aljunied, Mahani, Hu, Zhiqiang, Shen, Chenhui, Chia, Yew Ken, Li, Xingxuan, Wang, Jianyu, Tan, Qingyu, Cheng, Liying, Chen, Guanzheng, Deng, Yue, Yang, Sen, Liu, Chaoqun, Zhang, Hang, Bing, Lidong
Despite the remarkable achievements of large language models (LLMs) in various tasks, there remains a linguistic bias that favors high-resource languages, such as English, often at the expense of low-resource and regional languages. To address this i
External link:
http://arxiv.org/abs/2312.00738