PPLqa: An Unsupervised Information-Theoretic Quality Metric for Comparing Generative Large Language Models

Authors: Friedland, Gerald; Huang, Xin; Cui, Yueying; Kapoor, Vishaal; Khetan, Ashish; Das, Sanjiv
Publication year: 2024
Document type: Working Paper
Description: We propose PPLqa, an easy-to-compute, language-independent, information-theoretic metric that measures the quality of responses of generative Large Language Models (LLMs) in an unsupervised way, without requiring ground-truth annotations or human supervision. The metric enables users to rank generative language models by response quality and thereby select the best model for a given task. Our single metric assesses LLMs with an approach that subsumes, but is not explicitly based on, coherence and fluency (quality of writing) as well as relevance and consistency (appropriateness of the response to the query). PPLqa performs as well as related metrics and works better for long-form Q&A. PPLqa thus bypasses the lengthy annotation process required for ground-truth evaluation, while also correlating well with human and LLM rankings.
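
Note: The abstract does not spell out the PPLqa formula. As a rough illustration of a perplexity-style, annotation-free score in the spirit described above, the sketch below computes the perplexity of a model's answer conditioned on the query, using Hugging Face transformers. The function name, model choice, and conditioning scheme are assumptions for illustration only, not the paper's actual method.

import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def answer_perplexity(model_name: str, query: str, answer: str) -> float:
    """Hypothetical helper: perplexity of `answer` given `query` under a causal LM."""
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    model.eval()

    prompt_ids = tok(query, return_tensors="pt").input_ids
    full_ids = tok(query + answer, return_tensors="pt").input_ids

    # Mask the query tokens with -100 so the loss is averaged only over
    # the answer tokens (transformers ignores -100 positions in the loss).
    labels = full_ids.clone()
    labels[:, : prompt_ids.shape[1]] = -100

    with torch.no_grad():
        loss = model(full_ids, labels=labels).loss  # mean NLL over answer tokens
    return math.exp(loss.item())

# Lower perplexity means the answer is less "surprising" given the query;
# scoring several candidate LLMs' answers this way yields an unsupervised
# ranking, which is the flavor of comparison the abstract describes.
print(answer_perplexity("gpt2", "Q: What is entropy? A: ", "A measure of uncertainty."))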
Database: arXiv