Scaling Down to Scale Up: A Cost-Benefit Analysis of Replacing OpenAI's LLM with Open Source SLMs in Production

Author: Chandra Irugalbandara, Ashish Mahendra, Roland Daynauth, Tharuka Kasthuri Arachchige, Jayanaka Dantanarayana, Krisztian Flautner, Lingjia Tang, Yiping Kang, Jason Mars
Publication year: 2023
Subject:
Source: ISPASS-2024: 2024 IEEE International Symposium on Performance Analysis of Systems and Software
Document type: Working Paper
Description: Many companies build AI-enabled product experiences on large language models (LLMs) offered as a service, such as OpenAI's GPT-4. Alongside the benefits of ease of use and shortened time-to-solution, this reliance on proprietary services has downsides in model control, performance reliability, uptime predictability, and cost. At the same time, a flurry of open-source small language models (SLMs) has been made available for commercial use. However, their readiness to replace existing capabilities remains unclear, and a systematic approach to evaluating these SLMs holistically is not readily available. This paper presents a systematic evaluation methodology and a characterization of modern open-source SLMs and their trade-offs when replacing a proprietary LLM for a real-world product feature. We have designed SLaM, an open-source automated analysis tool that enables quantitative and qualitative testing of product features utilizing arbitrary SLMs. Using SLaM, we examine the quality and performance characteristics of modern SLMs relative to an existing customer-facing implementation built on the OpenAI GPT-4 API. Across 9 SLMs and their 29 variants, we observe that SLMs provide competitive results, significantly more consistent performance, and a cost reduction of 5x-29x compared to GPT-4.
Comment: Updated title, revised content
Database: arXiv