Showing 1 - 8 of 8 for search: '"Das, Rocktim Jyoti"'
Medical task-oriented dialogue systems can assist doctors by collecting patient medical history, aiding in diagnosis, or guiding treatment selection, thereby reducing doctor burnout and expanding access to medical services. However, doctor-patient dialogues …
External link:
http://arxiv.org/abs/2410.14204
End-to-end Task-Oriented Dialog (TOD) systems typically require extensive training datasets to perform well. In contrast, large language model (LLM) based TOD systems can excel even with limited data due to their ability to learn tasks through in-context learning …
External link:
http://arxiv.org/abs/2405.15585
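To make the in-context learning idea in the snippet above concrete, here is a minimal, hypothetical sketch of how a TOD turn can be handled without fine-tuning: a few worked examples are placed in the prompt so the model infers the task format. The example dialogs and the build_prompt helper are invented for illustration and are not from the paper.

# Sketch: few-shot (in-context) prompting for a single TOD turn.
# The worked examples below stand in for a handful of labeled dialogs;
# the assembled string would be sent to an LLM for completion.
EXAMPLES = [
    ("I need a cheap hotel in the north.",
     "inform(price=cheap, area=north, type=hotel)"),
    ("Book a table for two at 7pm.",
     "book(people=2, time=19:00, domain=restaurant)"),
]

def build_prompt(user_utterance: str) -> str:
    """Assemble a few-shot prompt: task instruction + examples + new input."""
    lines = ["Convert the user utterance into a dialog act."]
    for utt, act in EXAMPLES:
        lines.append(f"User: {utt}\nAct: {act}")
    lines.append(f"User: {user_utterance}\nAct:")
    return "\n\n".join(lines)

print(build_prompt("Find me an expensive Italian restaurant downtown."))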
Author:
Lynn, Teresa, Altakrori, Malik H., Magdy, Samar Mohamed, Das, Rocktim Jyoti, Lyu, Chenyang, Nasr, Mohamed, Samih, Younes, Aji, Alham Fikri, Nakov, Preslav, Godbole, Shantanu, Roukos, Salim, Florian, Radu, Habash, Nizar
The rapid evolution of Natural Language Processing (NLP) has favored major languages such as English, leaving a significant gap for many others due to limited resources. This is especially evident in the context of data annotation, a task whose importance …
External link:
http://arxiv.org/abs/2404.17342
Author:
Das, Rocktim Jyoti, Hristov, Simeon Emilov, Li, Haonan, Dimitrov, Dimitar Iliyanov, Koychev, Ivan, Nakov, Preslav
We introduce EXAMS-V, a new challenging multi-discipline multimodal multilingual exam benchmark for evaluating vision language models. It consists of 20,932 multiple-choice questions across 20 school disciplines covering natural science, social science, …
External link:
http://arxiv.org/abs/2403.10378
Author:
Wang, Yuxia, Wang, Minghan, Manzoor, Muhammad Arslan, Liu, Fei, Georgiev, Georgi, Das, Rocktim Jyoti, Nakov, Preslav
Large language models (LLMs), especially when instruction-tuned for chat, have become part of our daily lives, freeing people from the process of searching, extracting, and integrating information from multiple sources by offering a straightforward answer …
External link:
http://arxiv.org/abs/2402.02420
Large Language Models (LLMs) with billions of parameters are prime targets for network pruning, removing some model weights without hurting performance. Prior approaches such as magnitude pruning, SparseGPT, and Wanda, either concentrated solely on weights …
External link:
http://arxiv.org/abs/2311.04902
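For readers unfamiliar with the baselines named in the snippet above, here is a sketch of plain unstructured magnitude pruning, the simplest of them. It is not the paper's own method (SparseGPT and Wanda use more elaborate, activation-aware criteria); the function name and the 50% sparsity target are illustrative.

# Sketch: unstructured magnitude pruning with NumPy.
# Zeroes out the given fraction of weights with the smallest absolute value.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of weights with smallest |w|."""
    assert 0.0 <= sparsity < 1.0
    k = int(weights.size * sparsity)          # number of weights to remove
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value over the flattened tensor.
    threshold = np.partition(np.abs(weights), k - 1, axis=None)[k - 1]
    mask = np.abs(weights) > threshold        # keep weights above threshold
    return weights * mask

# Example: prune a random layer to 50% sparsity.
W = np.random.randn(4, 8)
W_pruned = magnitude_prune(W, 0.5)
print(f"achieved sparsity: {np.mean(W_pruned == 0):.2f}")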
Task-oriented dialog (TOD) agents often ground their responses on external knowledge bases (KBs). These KBs can be dynamic and may be updated frequently. Existing approaches for learning TOD agents assume the KB snapshot contemporary to each individual dialog …
External link:
http://arxiv.org/abs/2305.16697
We systematically study how three large language models with code capabilities - CodeT5, Codex, and ChatGPT - generalize to out-of-domain data. We consider two fundamental applications - code summarization and code generation. We split data into domains …
External link:
http://arxiv.org/abs/2303.09128