Give me Some Hard Questions: Synthetic Data Generation for Clinical QA

Authors: Bai, Fan; Harrigian, Keith; Stremmel, Joel; Hassanzadeh, Hamid; Saeedi, Ardavan; Dredze, Mark
Year: 2024
Document type: Working Paper
Description: Clinical Question Answering (QA) systems enable doctors to quickly access patient information from electronic health records (EHRs). However, training these systems requires significant annotated data, which is limited due to the expertise needed and the privacy concerns associated with clinical data. This paper explores generating Clinical QA data using large language models (LLMs) in a zero-shot setting. We find that naive prompting often results in easy questions that do not reflect the complexity of clinical scenarios. To address this, we propose two prompting strategies: 1) instructing the model to generate questions that do not overlap with the input context, and 2) summarizing the input record using a predefined schema to scaffold question generation. Experiments on two Clinical QA datasets demonstrate that our method generates more challenging questions, significantly improving fine-tuning performance over baselines. We compare synthetic and gold data and find a gap in training efficacy that stems from the quality of the synthetically generated answers.
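The two prompting strategies in the description could be sketched as zero-shot prompt templates. This is a minimal illustration only; the wording, function names, and schema fields below are hypothetical and not taken from the paper:

```python
# Hypothetical prompt templates illustrating the two strategies
# described above; the paper's actual prompts may differ.

def non_overlap_prompt(context: str) -> str:
    # Strategy 1: instruct the model to generate questions that do not
    # overlap with (i.e., are not copied verbatim from) the input context.
    return (
        "Read the following clinical note and write a challenging question "
        "whose answer is NOT a phrase copied verbatim from the note.\n\n"
        f"Note:\n{context}\n\nQuestion:"
    )

def schema_prompt(context: str, schema: list[str]) -> str:
    # Strategy 2: summarize the record with a predefined schema first,
    # then use the structured summary to scaffold question generation.
    fields = "\n".join(f"- {f}:" for f in schema)
    return (
        "Summarize the clinical note using this schema, then write one "
        "question that tests understanding of the summarized facts.\n\n"
        f"Schema:\n{fields}\n\nNote:\n{context}\n\nSummary and question:"
    )

# Example usage with a hypothetical note and schema fields.
prompt = schema_prompt(
    "Patient admitted with chest pain...",
    ["Chief complaint", "History", "Medications"],
)
```

Either template's output would then be passed to an LLM; the generated QA pairs serve as synthetic fine-tuning data.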
Comment: Accepted to ML4H 2024 Findings
Database: arXiv