Showing 1 - 10 of 3,211 for search: '"Liang, Lei"'
Author:
Liu, Zhiqiang, Chen, Mingyang, Hua, Yin, Chen, Zhuo, Liu, Ziqi, Liang, Lei, Chen, Huajun, Zhang, Wen
Beyond-triple fact representations, including hyper-relational facts with auxiliary key-value pairs, temporal facts with additional timestamps, and nested facts implying relationships between facts, are gaining significant attention. However, existing …
External link:
http://arxiv.org/abs/2411.07019
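The abstract above enumerates three beyond-triple fact shapes. As a concrete illustration only (the class and field names below are my own assumptions, not the paper's schema), here is a minimal Python sketch of how each shape can be carried as a data structure:

```python
# Minimal sketch of the three beyond-triple fact shapes named in the
# abstract. Illustrative only; requires Python 3.10+ for the type hints.
from dataclasses import dataclass, field


@dataclass
class Triple:
    head: str
    relation: str
    tail: str


@dataclass
class HyperRelationalFact:
    # A base triple extended with auxiliary key-value qualifiers.
    triple: Triple
    qualifiers: dict[str, str] = field(default_factory=dict)


@dataclass
class TemporalFact:
    # A triple stamped with the time (or interval) at which it holds.
    triple: Triple
    start: str | None = None  # e.g. "1896"
    end: str | None = None    # e.g. "1900"


@dataclass
class NestedFact:
    # A fact relating two facts, e.g. one fact "implies" another.
    head_fact: Triple
    relation: str
    tail_fact: Triple


if __name__ == "__main__":
    base = Triple("Einstein", "educated_at", "ETH Zurich")
    print(HyperRelationalFact(base, {"degree": "BSc"}))
    print(TemporalFact(base, start="1896", end="1900"))
```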
Author:
Liang, Lei, Sun, Mengshu, Gui, Zhengke, Zhu, Zhongshu, Jiang, Zhouyu, Zhong, Ling, Qu, Yuan, Zhao, Peilong, Bo, Zhongpu, Yang, Jin, Xiong, Huaidong, Yuan, Lin, Xu, Jun, Wang, Zaoyang, Zhang, Zhiqiang, Zhang, Wen, Chen, Huajun, Chen, Wenguang, Zhou, Jun
The recently developed retrieval-augmented generation (RAG) technology has enabled the efficient construction of domain-specific applications. However, it also has limitations, including the gap between vector similarity and the relevance of knowledge …
External link:
http://arxiv.org/abs/2409.13731
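To make the "gap between vector similarity and the relevance of knowledge" concrete, here is a minimal sketch of the plain vector-retrieval step RAG relies on, with toy vectors standing in for a learned text embedder (all names and numbers are illustrative). The ranking is purely geometric, so the nearest passage wins whether or not it supports the reasoning the question actually needs.

```python
# Toy vector-similarity retrieval: rank passages by cosine similarity
# to the query. Real systems would embed text with a learned model.
import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy 3-d "embeddings" for one query and two candidate passages.
query = np.array([1.0, 0.2, 0.0])
passages = {
    "lexically similar but unhelpful": np.array([0.9, 0.3, 0.1]),
    "needed for the actual reasoning": np.array([0.2, 0.9, 0.4]),
}

ranked = sorted(passages.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
for name, vec in ranked:
    print(f"{cosine(query, vec):.3f}  {name}")
```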
Author:
Zhang, Ningyu, Xi, Zekun, Luo, Yujie, Wang, Peng, Tian, Bozhong, Yao, Yunzhi, Zhang, Jintian, Deng, Shumin, Sun, Mengshu, Liang, Lei, Zhang, Zhiqiang, Zhu, Xiaowei, Zhou, Jun, Chen, Huajun
Knowledge representation has been a central aim of AI since its inception. Symbolic Knowledge Graphs (KGs) and neural Large Language Models (LLMs) can both represent knowledge. KGs provide highly accurate and explicit knowledge representation, but face …
External link:
http://arxiv.org/abs/2409.07497
Author:
Zhang, Jintian, Peng, Cheng, Sun, Mengshu, Chen, Xiang, Liang, Lei, Zhang, Zhiqiang, Zhou, Jun, Chen, Huajun, Zhang, Ningyu
Despite the recent advancements in Large Language Models (LLMs), which have significantly enhanced the generative capabilities for various NLP tasks, LLMs still face limitations in directly handling retrieval tasks. However, many practical applications …
External link:
http://arxiv.org/abs/2409.05152
Author:
Wang, Xiaohan, Yang, Xiaoyan, Zhu, Yuqi, Shen, Yue, Wang, Jian, Wei, Peng, Liang, Lei, Gu, Jinjie, Chen, Huajun, Zhang, Ningyu
Large Language Models (LLMs) like GPT-4, MedPaLM-2, and Med-Gemini achieve performance competitive with human experts across various medical benchmarks. However, they still face challenges in making professional diagnoses akin to physicians, particularly …
External link:
http://arxiv.org/abs/2408.12579
Multi-hop question answering is a challenging task with distinct industrial relevance, and Retrieval-Augmented Generation (RAG) methods based on large language models (LLMs) have become a popular approach to tackle this task. Owing to the potential …
External link:
http://arxiv.org/abs/2407.13101
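A common way to apply RAG to multi-hop questions is an iterative retrieve-then-generate loop. The sketch below shows that generic pattern, not this paper's specific method; `retrieve` and `generate` are hypothetical stand-ins for a real retriever and LLM call, and the `MISSING:` protocol is an assumption for illustration.

```python
# Generic iterative RAG loop for multi-hop QA: retrieve, ask the model
# to answer or name the missing fact, then retrieve that fact next.
from typing import Callable


def multi_hop_rag(
    question: str,
    retrieve: Callable[[str], list[str]],
    generate: Callable[[str], str],
    max_hops: int = 3,
) -> str:
    context: list[str] = []
    query = question
    for _ in range(max_hops):
        context.extend(retrieve(query))
        prompt = (
            "Context:\n" + "\n".join(context)
            + f"\n\nQuestion: {question}\nAnswer, or reply MISSING:<fact>:"
        )
        answer = generate(prompt)
        if not answer.startswith("MISSING:"):
            return answer  # the model judged the context sufficient
        query = answer.removeprefix("MISSING:")  # retrieve the missing hop
    return generate("Best-effort answer given:\n" + "\n".join(context))


# Toy usage with stand-in components:
print(multi_hop_rag(
    "Where was the author of 'X' born?",
    retrieve=lambda q: [f"(doc about: {q})"],
    generate=lambda p: "Toyville",
))
```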
Knowledge Graph Embedding (KGE) is a common method for Knowledge Graphs (KGs) to serve various artificial intelligence tasks. The suitable dimensions of the embeddings depend on the storage and computing conditions of the specific application scenarios …
External link:
http://arxiv.org/abs/2407.02779
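To ground the role of the embedding dimension, the sketch below scores triples with TransE, a standard KGE model used here purely as an example (the paper may use a different model). `dim` is the knob that trades storage and compute against accuracy.

```python
# TransE-style triple scoring with an explicit embedding dimension.
# Entities/relations and the dimension choice are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
dim = 64  # smaller dim -> less storage/compute, usually lower accuracy

n_entities, n_relations = 1000, 50
E = rng.normal(size=(n_entities, dim))   # entity embeddings
R = rng.normal(size=(n_relations, dim))  # relation embeddings


def transe_score(h: int, r: int, t: int) -> float:
    # TransE models a true triple (h, r, t) as h + r ≈ t, so a lower
    # L2 distance means a more plausible triple.
    return float(np.linalg.norm(E[h] + R[r] - E[t]))


print(transe_score(0, 0, 1))
```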
Author:
Zhang, Wen, Jin, Long, Zhu, Yushan, Chen, Jiaoyan, Huang, Zhiwei, Wang, Junjie, Hua, Yin, Liang, Lei, Chen, Huajun
Natural language question answering (QA) over structured data sources such as tables and knowledge graphs (KGs) has been widely investigated, for example with Large Language Models (LLMs). The main solutions include question to formal query parsing …
External link:
http://arxiv.org/abs/2406.18916
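As a toy illustration of "question to formal query parsing", the sketch below maps one question shape onto a SPARQL template. Real systems use learned parsers or LLM prompting; the regex, the predicate URI, and the template here are my own assumptions.

```python
# Toy semantic parsing: one regex-matched question shape is rewritten
# into a SPARQL query. Requires Python 3.10+ for the return type hint.
import re

SPARQL_TEMPLATE = """SELECT ?person WHERE {{
  ?person <http://example.org/authorOf> <http://example.org/{work}> .
}}"""


def parse_question(question: str) -> str | None:
    m = re.match(r"who (?:wrote|authored) (.+)\?", question.strip(), re.I)
    if m:
        work = m.group(1).strip().replace(" ", "_")
        return SPARQL_TEMPLATE.format(work=work)
    return None  # unsupported question shape


print(parse_question("Who wrote War and Peace?"))
```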
Author:
Gan, Chunjing, Yang, Dan, Hu, Binbin, Zhang, Hanxiao, Li, Siyuan, Liu, Ziqi, Shen, Yue, Ju, Lin, Zhang, Zhiqiang, Gu, Jinjie, Liang, Lei, Zhou, Jun
In recent years, large language models (LLMs) have made remarkable achievements in various domains. However, the untimeliness and cost of knowledge updates, coupled with hallucination issues of LLMs, have curtailed their applications in knowledge-intensive …
External link:
http://arxiv.org/abs/2405.19893
Author:
Zhang, Yichi, Hu, Binbin, Chen, Zhuo, Guo, Lingbing, Liu, Ziqi, Zhang, Zhiqiang, Liang, Lei, Chen, Huajun, Zhang, Wen
Knowledge graphs (KGs) provide reliable external knowledge for a wide variety of AI tasks in the form of structured triples. Knowledge graph pre-training (KGP) aims to pre-train neural networks on large-scale KGs and provide unified interfaces to enhance …
External link:
http://arxiv.org/abs/2405.13085
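For a concrete sense of what pre-training on triples can look like, here is one generic negative-sampling margin-loss step, a standard KGE-style objective offered as an illustration only, not this paper's actual pre-training method.

```python
# One generic KG pre-training step: score a true triple above a
# corrupted (negative-sampled) one under a margin loss. Illustrative.
import numpy as np

rng = np.random.default_rng(0)
dim, n_ent, n_rel = 32, 500, 20
E = rng.normal(scale=0.1, size=(n_ent, dim))  # entity embeddings
R = rng.normal(scale=0.1, size=(n_rel, dim))  # relation embeddings


def score(h: int, r: int, t: int) -> float:
    return -float(np.linalg.norm(E[h] + R[r] - E[t]))  # higher = plausible


def margin_loss(pos, neg, margin=1.0) -> float:
    return max(0.0, margin - score(*pos) + score(*neg))


pos = (0, 0, 1)                            # a "true" triple
neg = (0, 0, int(rng.integers(n_ent)))     # corrupt the tail entity
print(margin_loss(pos, neg))
```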