Showing 1 - 3 of 3 for search: '"Csaki, Zoltan"'
Author:
Jain, Swayambhoo, Raju, Ravi, Li, Bo, Csaki, Zoltan, Li, Jonathan, Liang, Kaizhao, Feng, Guoyao, Thakkar, Urmish, Sampat, Anand, Prabhakar, Raghu, Jairath, Sumati
Large Language Models (LLMs) have achieved remarkable advancements, but their monolithic nature presents challenges in terms of scalability, cost, and customization. This paper introduces the Composition of Experts (CoE), a modular compound AI system …
External link:
http://arxiv.org/abs/2412.01868
Author:
Csaki, Zoltan, Li, Bo, Li, Jonathan, Xu, Qiantong, Pawakapan, Pian, Zhang, Leon, Du, Yun, Zhao, Hengyu, Hu, Changran, Thakker, Urmish
Despite the widespread availability of LLMs, there remains a substantial gap in their capabilities and availability across diverse languages. One approach to address these issues has been to take an existing pre-trained LLM and continue to train it on …
External link:
http://arxiv.org/abs/2404.05829
Recent large language models (LLMs) exhibit sub-optimal performance on low-resource languages, as the training data of these models is usually dominated by English and other high-resource languages. Furthermore, it is challenging to train models for low-resource languages …
External link:
http://arxiv.org/abs/2311.05741