Authors:
Liang, Mengfei, Arun, Archish, Wu, Zekun, Munoz, Cristian, Lutch, Jonathan, Kazim, Emre, Koshiyama, Adriano, Treleaven, Philip
Hallucination, the generation of factually incorrect content, is a growing challenge in Large Language Models (LLMs). Existing detection and mitigation methods are often isolated and insufficient for domain-specific needs, lacking a standardized pipeline…
External link:
http://arxiv.org/abs/2409.11353