Showing 1 - 10 of 386 for search: '"WANG Sirui"'
Published in:
Jixie qiangdu, pp. 871-878 (2023)
In view of the small number of data samples in field tests, a method of data expansion using a probability distribution and a proxy model was proposed. This method determines the interval division and the number of sample points that need to be gen…
External link:
https://doaj.org/article/323e7c3efa564059bcc461129b07d2eb
Retrieval-Augmented Generation (RAG) demonstrates great value in alleviating outdated knowledge or hallucination by supplying LLMs with updated and relevant knowledge. However, there are still several difficulties for RAG in understanding complex mul…
External link:
http://arxiv.org/abs/2404.14043
Author:
Pan, Ruotong, Cao, Boxi, Lin, Hongyu, Han, Xianpei, Zheng, Jia, Wang, Sirui, Cai, Xunliang, Sun, Le
The rapid development of large language models has led to the widespread adoption of Retrieval-Augmented Generation (RAG), which integrates external knowledge to alleviate knowledge bottlenecks and mitigate hallucinations. However, the existing RAG p…
External link:
http://arxiv.org/abs/2404.06809
The sequential recommendation task aims to predict the item a user is interested in according to his/her historical action sequence. However, inevitable random actions, i.e., a user randomly accessing an item among multiple candidates or clicking several…
External link:
http://arxiv.org/abs/2404.05342
Technological innovations are a major driver of economic development that depends on the exchange of knowledge and ideas among those with unique but complementary specialized knowledge and know-how. However, measurement of specialized knowledge embedde…
External link:
http://arxiv.org/abs/2309.14451
The field of emotion recognition in conversation (ERC) has been focusing on separating sentence feature encoding and context modeling, lacking exploration of generative paradigms based on unified designs. In this study, we propose a novel approach, I…
External link:
http://arxiv.org/abs/2309.11911
Knowledge Base Question Answering (KBQA) aims to answer natural language questions with factual information such as entities and relations in KBs. However, traditional Pre-trained Language Models (PLMs) are directly pre-trained on large-scale natural…
External link:
http://arxiv.org/abs/2308.14436
Author:
Wang, Keheng, Duan, Feiyu, Wang, Sirui, Li, Peiguang, Xian, Yunsen, Yin, Chuantao, Rong, Wenge, Xiong, Zhang
Equipped with Chain-of-Thought (CoT), large language models (LLMs) have shown impressive reasoning ability in various downstream tasks. Even so, suffering from hallucinations and the inability to access external knowledge, LLMs often come with incorr…
External link:
http://arxiv.org/abs/2308.13259
Translating natural language queries into SQL in a seq2seq manner has attracted much attention recently. However, compared with abstract-syntax-tree-based SQL generation, seq2seq semantic parsers face many more challenges, including poor quality…
External link:
http://arxiv.org/abs/2306.08368
Published in:
ICASSP 2023
Transformer-based pre-trained models have achieved great improvements in semantic matching. However, existing models still suffer from insufficient ability to capture subtle differences. The modification, addition and deletion of words in sentence pa…
External link:
http://arxiv.org/abs/2302.12530