Evaluating the External and Parametric Knowledge Fusion of Large Language Models

Author: Zhang, Hao; Zhang, Yuyang; Li, Xiaoguang; Shi, Wenxuan; Xu, Haonan; Liu, Huanshuo; Wang, Yasheng; Shang, Lifeng; Liu, Qun; Liu, Yong; Tang, Ruiming
Publication Year: 2024
Subject:
Document Type: Working Paper
Description: Integrating external knowledge into large language models (LLMs) presents a promising solution to overcome the limitations imposed by their antiquated and static parametric memory. Prior studies, however, have tended to over-rely on external knowledge, underestimating the valuable contributions of LLMs' intrinsic parametric knowledge. The efficacy of LLMs in blending external and parametric knowledge remains largely unexplored, especially in cases where external knowledge is incomplete and necessitates supplementation by their parametric knowledge. We propose to deconstruct knowledge fusion into four distinct scenarios, offering the first thorough investigation of LLM behavior across each. We develop a systematic pipeline for data construction and knowledge infusion to simulate these fusion scenarios, facilitating a series of controlled experiments. Our investigation reveals that enhancing parametric knowledge within LLMs can significantly bolster their capability for knowledge integration. Nonetheless, we identify persistent challenges in memorizing and eliciting parametric knowledge, and in determining parametric knowledge boundaries. Our findings aim to steer future explorations on harmonizing external and parametric knowledge within LLMs.
Comment: 15 pages, 3 figures, 3 tables
Database: arXiv