TCMChat: A generative large language model for traditional Chinese medicine

Author: Yizheng Dai, Xin Shao, Jinlu Zhang, Yulong Chen, Qian Chen, Jie Liao, Fei Chi, Junhua Zhang, Xiaohui Fan
Language: English
Year of publication: 2024
Subject:
Source: Pharmacological Research, Vol 210, Pp 107530 (2024)
Document type: article
ISSN: 1096-1186
DOI: 10.1016/j.phrs.2024.107530
Description: The use of ground-breaking large language models (LLMs) coupled with dialogue systems has become increasingly prevalent in the medical domain. Nevertheless, the expertise of LLMs in Traditional Chinese Medicine (TCM) remains limited, despite several TCM-specific LLMs having been proposed recently. Herein, we introduce TCMChat (https://xomics.com.cn/tcmchat), a generative LLM built through pre-training (PT) and supervised fine-tuning (SFT) on large-scale curated TCM text knowledge and Chinese Question-Answering (QA) datasets. In detail, we first compiled a customized training set covering six scenarios of Chinese medicine through text mining and manual verification: TCM knowledge base, multiple-choice questions, reading comprehension, entity extraction, medical case diagnosis, and herb or formula recommendation. Next, we performed PT and SFT on this collection, using Baichuan2-7B-Chat as the foundation model. Benchmarking datasets and case studies further demonstrate the superior performance of TCMChat compared with existing models. Our code, data and model are publicly released on GitHub (https://github.com/ZJUFanLab/TCMChat) and HuggingFace (https://huggingface.co/ZJUFanLab), providing a high-quality knowledge base for TCM modernization research together with a user-friendly web-based dialogue tool.
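Since the abstract states that the fine-tuned model is released on HuggingFace under the ZJUFanLab organization and is derived from Baichuan2-7B-Chat, a minimal inference sketch might look like the following. The repository id "ZJUFanLab/TCMChat" and the example prompt are hypothetical placeholders; consult https://huggingface.co/ZJUFanLab for the actual model id and recommended usage.

```python
# Minimal sketch (assumptions): load a released TCMChat checkpoint with the
# HuggingFace transformers library and ask a single-turn TCM question.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZJUFanLab/TCMChat"  # hypothetical repo id, not confirmed by the abstract

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,
)

# Baichuan2-style chat checkpoints typically expose a `chat` helper via
# trust_remote_code; fall back to plain generate() if it is absent.
prompt = "请介绍一下黄芪的功效与主治。"  # "Describe the effects and indications of Astragalus."
if hasattr(model, "chat"):
    response = model.chat(tokenizer, [{"role": "user", "content": prompt}])
else:
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    response = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

print(response)
```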
Database: Directory of Open Access Journals