Ranking LLMs by compression

Author: Guo, Peijia; Li, Ziguang; Hu, Haibo; Huang, Chao; Li, Ming; Zhang, Rui
Publication year: 2024
Document type: Working Paper
Description: We conceptualize understanding as information compression and propose a method for ranking large language models (LLMs) based on lossless data compression. We show that, when an LLM is used as the prior, the compression length under arithmetic coding is equivalent to the cumulative negative log probability of the data; that is, the pre-training phase of the model is essentially a process of learning the optimal coding length. Consequently, the compression ratio can be computed without performing any actual compression, which greatly reduces evaluation overhead. In this paper, we use five large language models as priors for compression and compare their performance on challenging natural language processing tasks, including sentence completion, question answering, and coreference resolution. Experimental results show that compression ratio and model performance are positively correlated, so compression ratio can serve as a general metric for evaluating large language models.
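The equivalence the abstract relies on is that an arithmetic coder driven by the model's next-token distribution emits a code of length sum_i -log2 p(x_i | x_<i) bits, up to a few bits of overhead, so the compression ratio can be read directly off the model's log-likelihoods. Below is a minimal sketch of that shortcut, assuming the Hugging Face `transformers` API; the model name is illustrative, not the paper's actual setup:

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def compression_ratio(text: str, model_name: str = "gpt2") -> float:
    """Estimate the lossless compression ratio of `text` when an LLM is
    used as the prior for arithmetic coding. The arithmetic-coded length
    in bits equals the cumulative negative log2 probability of the tokens
    (up to a few bits of overhead), so no actual coding is performed."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    model.eval()

    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # Passing labels=ids makes the model compute the mean
        # next-token cross-entropy (in nats) over the shifted sequence.
        out = model(ids, labels=ids)

    n_predicted = ids.shape[1] - 1  # first token has no prefix to condition on
    total_bits = out.loss.item() * n_predicted / math.log(2)

    raw_bits = 8 * len(text.encode("utf-8"))  # uncompressed size in bits
    return total_bits / raw_bits

if __name__ == "__main__":
    print(compression_ratio("The quick brown fox jumps over the lazy dog."))
```

A lower ratio means the model assigns higher probability to the text; ranking models by this quantity is, under the abstract's argument, equivalent to ranking them by how well they would losslessly compress the evaluation corpus.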
Comment: 7 pages, 4 tables
Database: arXiv