Showing 1 - 10 of 518 for search: '"Dai, Xinyu"'
Author:
Wang, Yidong, Guo, Qi, Yao, Wenjin, Zhang, Hongbo, Zhang, Xin, Wu, Zhen, Zhang, Meishan, Dai, Xinyu, Zhang, Min, Wen, Qingsong, Ye, Wei, Zhang, Shikun, Zhang, Yue
This paper introduces AutoSurvey, a speedy and well-organized methodology for automating the creation of comprehensive literature surveys in rapidly evolving fields like artificial intelligence. Traditional survey paper creation faces challenges due …
External link:
http://arxiv.org/abs/2406.10252
Large language models (LLMs) exhibit robust capabilities in text generation and comprehension, mimicking human behavior and exhibiting synthetic personalities. However, some LLMs have displayed offensive personalities, propagating toxic discourse. …
External link:
http://arxiv.org/abs/2406.04583
Multimodal Large Language Models (MLLMs) are widely regarded as crucial in the exploration of Artificial General Intelligence (AGI). The core of MLLMs lies in their capability to achieve cross-modal alignment. To attain this goal, current MLLMs …
External link:
http://arxiv.org/abs/2405.14129
Contrastive Language-Image Pre-training (CLIP) has shown powerful zero-shot learning performance. Few-shot learning aims to further enhance the transfer capability of CLIP by giving a few images in each class, aka 'few shots'. Most existing methods …
External link:
http://arxiv.org/abs/2404.09778
Emotion Recognition in Conversation (ERC) involves detecting the underlying emotion behind each utterance within a conversation. Effectively generating representations for utterances remains a significant challenge in this task. Recent works propose …
External link:
http://arxiv.org/abs/2403.20289
Author:
Pang, Taotian, Lou, Xingyu, Zhao, Fei, Wu, Zhen, Dong, Kuiyao, Peng, Qiuying, Qi, Yue, Dai, Xinyu
Knowledge-aware recommendation methods (KGR) based on graph neural networks (GNNs) and contrastive learning (CL) have achieved promising performance. However, they fall short in modeling fine-grained user preferences and …
External link:
http://arxiv.org/abs/2403.16037
Author:
Gao, Yuan, Zhu, Yiheng, Cao, Yuanbin, Zhou, Yinzhi, Wu, Zhen, Chen, Yujie, Wu, Shenglan, Hu, Haoyuan, Dai, Xinyu
Open Domain Multi-Hop Question Answering (ODMHQA) plays a crucial role in Natural Language Processing (NLP) by aiming to answer complex questions through multi-step reasoning over retrieved information from external knowledge sources. …
External link:
http://arxiv.org/abs/2403.12393
Author:
Zheng, Kangjie, Long, Siyu, Lu, Tianyu, Yang, Junwei, Dai, Xinyu, Zhang, Ming, Nie, Zaiqing, Ma, Wei-Ying, Zhou, Hao
Protein language models have demonstrated significant potential in the field of protein engineering. However, current protein language models primarily operate at the residue scale, which limits their ability to provide information at the atom level.
External link:
http://arxiv.org/abs/2403.12995
Author:
Xing, Shangyu, Zhao, Fei, Wu, Zhen, An, Tuo, Chen, Weihao, Li, Chunhui, Zhang, Jianbing, Dai, Xinyu
Multimodal large language models (MLLMs) have attracted increasing attention in the past few years, but they may still generate descriptions that include objects not present in the corresponding images, a phenomenon known as object hallucination. …
External link:
http://arxiv.org/abs/2402.09801
Large language models have shown impressive capabilities across a variety of NLP tasks, yet their autoregressive text generation is time-consuming. One way to speed them up is speculative decoding, which generates candidate segments …
External link:
http://arxiv.org/abs/2401.06706
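The last abstract above sketches speculative decoding: a cheap draft model proposes a segment of candidate tokens, and the expensive target model verifies them, keeping the longest agreeing prefix. As a rough illustration of that verify-a-prefix loop only (toy next-token functions over integer token ids, not the method of the paper above):

```python
# Toy illustration of speculative decoding (hypothetical models, not any
# paper's implementation): a fast draft model proposes k tokens, the slow
# target model checks them, and the agreeing prefix is accepted plus one
# corrected token from the target model.

def draft_model(context):
    # Fast but approximate: always predicts previous token + 1 (mod 10).
    return (context[-1] + 1) % 10

def target_model(context):
    # Slow but authoritative: diverges from the draft on token 5,
    # so we can see a rejection happen.
    nxt = (context[-1] + 1) % 10
    return 0 if nxt == 5 else nxt

def speculative_step(context, k=4):
    """Propose k draft tokens, accept the prefix the target model agrees
    with, then append one token from the target model itself."""
    # Draft phase: k cheap sequential proposals.
    draft, ctx = [], list(context)
    for _ in range(k):
        t = draft_model(ctx)
        draft.append(t)
        ctx.append(t)
    # Verification phase (done in one parallel pass in real systems).
    accepted, ctx = [], list(context)
    for t in draft:
        if target_model(ctx) == t:
            accepted.append(t)
            ctx.append(t)
        else:
            break  # first disagreement: discard the rest of the draft
    accepted.append(target_model(ctx))  # target's token after the prefix
    return context + accepted

print(speculative_step([0, 1], k=4))  # -> [0, 1, 2, 3, 4, 0]
```

Here one expensive step yields four tokens (three accepted draft tokens plus the target's correction) instead of one, which is the source of the speed-up when the draft model agrees often.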