Showing 1 - 10 of 36 for search: '"Yu, BiHui"'
Author:
Wei, Jingxuan, Tan, Cheng, Chen, Qi, Wu, Gaowei, Li, Siyuan, Gao, Zhangyang, Sun, Linzhuang, Yu, Bihui, Guo, Ruifeng
We introduce the task of text-to-diagram generation, which focuses on creating structured visual representations directly from textual descriptions. Existing approaches in text-to-image and text-to-code generation lack the logical organization and…
External link:
http://arxiv.org/abs/2411.11916
Author:
Sun, Linzhuang, Liang, Hao, Wei, Jingxuan, Yu, Bihui, He, Conghui, Zhou, Zenan, Zhang, Wentao
Large Language Models (LLMs) have exhibited exceptional performance across a broad range of tasks and domains. However, they still encounter difficulties in solving mathematical problems due to the rigorous and logical nature of mathematics. Previous…
External link:
http://arxiv.org/abs/2409.17972
Author:
Liang, Hao, Sun, Linzhuang, Wei, Jingxuan, Huang, Xijie, Sun, Linkun, Yu, Bihui, He, Conghui, Zhang, Wentao
In recent years, with the rapid advancements in large language models (LLMs), achieving excellent empathetic response capabilities has become a crucial prerequisite. Consequently, managing and understanding empathetic datasets have gained increasing…
External link:
http://arxiv.org/abs/2407.21669
In recent years, with the rapid advancements in large language models (LLMs), achieving excellent empathetic response capability has become a crucial prerequisite. Consequently, managing and understanding large-scale video datasets has gained…
External link:
http://arxiv.org/abs/2407.01937
Author:
Tan, Cheng, Wei, Jingxuan, Sun, Linzhuang, Gao, Zhangyang, Li, Siyuan, Yu, Bihui, Guo, Ruifeng, Li, Stan Z.
Large language models equipped with retrieval-augmented generation (RAG) represent a burgeoning field aimed at enhancing answering capabilities by leveraging external knowledge bases. Although the application of RAG with language-only models has been…
External link:
http://arxiv.org/abs/2405.20834
Knowledge distillation, transferring knowledge from a teacher model to a student model, has emerged as a powerful technique in neural machine translation for compressing models or simplifying training targets. Knowledge distillation encompasses two…
External link:
http://arxiv.org/abs/2404.14827
In the fields of computer vision and natural language processing, multimodal chart question-answering, especially involving color, structure, and textless charts, poses significant challenges. Traditional methods, which typically involve either…
External link:
http://arxiv.org/abs/2404.01548
Rational Sensibility: LLM Enhanced Empathetic Response Generation Guided by Self-presentation Theory
The development of Large Language Models (LLMs) provides human-centered Artificial General Intelligence (AGI) with a glimmer of hope. Empathy serves as a key emotional attribute of humanity, playing an irreplaceable role in human-centered AGI.…
External link:
http://arxiv.org/abs/2312.08702
Knowledge distillation, a technique for model compression and performance enhancement, has gained significant traction in Neural Machine Translation (NMT). However, existing research primarily focuses on empirical applications, and there is a lack of…
External link:
http://arxiv.org/abs/2312.08585
The continuous development of artificial intelligence has a profound impact on biomedicine and other fields, providing new research ideas and technical methods. Brain-inspired computing is an important intersection between multimodal technology and…
External link:
http://arxiv.org/abs/2312.07213