Using Large Language Models to Generate Authentic Multi-agent Knowledge Work Datasets

Author: Heim, Desiree, Jilek, Christian, Ulges, Adrian, Dengel, Andreas
Publication year: 2024
Subject:
Source: INFORMATIK 2024
Document type: Working Paper
DOI: 10.18420/inf2024_118
Description: Current publicly available knowledge work data collections lack diversity, extensive annotations, and contextual information about the users and their documents. These issues hinder objective and comparable data-driven evaluations and optimizations of knowledge work assistance systems. Due to the considerable resources needed to collect such data in real-life settings and the necessity of data censorship, collecting such a dataset appears nearly impossible. For this reason, we propose a configurable, multi-agent knowledge work dataset generator. This system simulates collaborative knowledge work among agents producing Large Language Model-generated documents and accompanying data traces. Additionally, the generator captures all background information, whether given in its configuration or created during the simulation process, in a knowledge graph. Finally, the resulting dataset can be utilized and shared without privacy or confidentiality concerns. This paper introduces the design and vision of our approach and focuses on generating authentic knowledge work documents using Large Language Models. The potential of our approach is demonstrated by a study in which human raters assessed 53% of the generated and 74% of the real documents as realistic. Furthermore, we analyze the authenticity criteria mentioned in the participants' comments and elaborate on potential improvements for the common issues identified.
Comment: Accepted and published (INFORMATIK Festival, Wiesbaden, 2024)
Database: arXiv