Showing 1 - 10
of 343
for the search: '"Nayak, P. V."'
Author:
Khandelwal, Apoorv, Yun, Tian, Nayak, Nihal V., Merullo, Jack, Bach, Stephen H., Sun, Chen, Pavlick, Ellie
Pre-training is notoriously compute-intensive and academic researchers are notoriously under-resourced. It is, therefore, commonly assumed that academics can't pre-train models. In this paper, we seek to clarify this assumption. We first survey academic…
External link:
http://arxiv.org/abs/2410.23261
We introduce Bonito, an open-source model for conditional task generation that converts unannotated text into task-specific training datasets for instruction tuning. We aim to enable zero-shot task adaptation of large language models on users' specialized… (see the sketch below)
External link:
http://arxiv.org/abs/2402.18334
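The entry above describes the general pattern of conditional task generation: condition a generative model on a task type plus an unannotated passage and have it emit a synthetic training example. A minimal, hedged sketch of that pattern in Python, with google/flan-t5-small standing in for the actual Bonito model and an invented prompt layout (both are illustrative assumptions, not taken from the paper):

# Hedged sketch: conditional task generation from unannotated text.
# "google/flan-t5-small" is a stand-in for the actual Bonito model and the
# prompt layout is invented for illustration; neither is taken from the paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/flan-t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

def generate_task(passage: str, task_type: str) -> str:
    # Condition the generator on a task type and a raw passage, and ask it to
    # emit a question/answer pair usable for instruction tuning.
    prompt = (f"Task type: {task_type}\n"
              f"Passage: {passage}\n"
              "Write one question about the passage and its answer.")
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(generate_task("The mitochondrion produces most of the cell's ATP.",
                    "extractive question answering"))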
Author:
Lewis, Martha, Nayak, Nihal V., Yu, Peilin, Yu, Qinan, Merullo, Jack, Bach, Stephen H., Pavlick, Ellie
Published in:
In Findings of the Association for Computational Linguistics: EACL 2024, pages 1487-1500, Malta. Association for Computational Linguistics.
Large-scale neural network models combining text and images have made incredible progress in recent years. However, it remains an open question to what extent such models encode compositional representations of the concepts over which they operate…
External link:
http://arxiv.org/abs/2212.10537
Evaluating clustering quality with reliable evaluation metrics like normalized mutual information (NMI) requires labeled data that can be expensive to annotate. We focus on the underexplored problem of estimating clustering quality with limited labels… (a small NMI example follows below)
External link:
http://arxiv.org/abs/2210.00064
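Normalized mutual information itself is simple to compute once ground-truth labels exist; the paper's problem is estimating it when labels are scarce. For reference, the fully labeled computation with scikit-learn on toy data (this shows only the standard metric, not the paper's estimator):

# Standard NMI between a clustering and ground-truth labels, on toy data.
# This is the fully supervised reference metric, not the paper's limited-label estimator.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import normalized_mutual_info_score

X, y_true = make_blobs(n_samples=300, centers=3, random_state=0)
y_pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

print("NMI:", normalized_mutual_info_score(y_true, y_pred))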
We introduce compositional soft prompting (CSP), a parameter-efficient learning technique to improve the zero-shot compositionality of large-scale pretrained vision-language models (VLMs) like CLIP. We develop CSP for compositional zero-shot learning… (a toy sketch of the idea follows below)
External link:
http://arxiv.org/abs/2204.03574
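A toy sketch of the parameter-efficient idea behind compositional soft prompting: freeze the pretrained model and learn only one vector per attribute and one per object, composing the two into the prompt for each attribute-object pair. The tiny frozen encoder, dimensions, and scoring rule below are stand-ins for CLIP, used purely for illustration:

# Toy sketch of compositional soft prompting: only attribute/object vectors train.
# The frozen "text encoder" is a stand-in for CLIP's; all sizes are arbitrary.
import torch
import torch.nn as nn

attrs, objs, dim = ["red", "sliced"], ["apple", "tomato"], 32

attr_emb = nn.Parameter(torch.randn(len(attrs), dim) * 0.02)   # trainable
obj_emb = nn.Parameter(torch.randn(len(objs), dim) * 0.02)     # trainable

text_encoder = nn.Linear(2 * dim, dim)                         # frozen stand-in
for p in text_encoder.parameters():
    p.requires_grad = False

def compose(a: int, o: int) -> torch.Tensor:
    # Concatenate the soft attribute and object tokens and encode the "prompt".
    return text_encoder(torch.cat([attr_emb[a], obj_emb[o]]))

image_feat = torch.randn(dim)                                   # pretend CLIP image feature
scores = torch.stack([torch.cosine_similarity(compose(a, o), image_feat, dim=0)
                      for a in range(len(attrs)) for o in range(len(objs))])
print(scores.softmax(dim=0))   # distribution over the four (attribute, object) pairs

# Training would backpropagate a pair-classification loss into attr_emb and obj_emb only.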
Author:
Bach, Stephen H., Sanh, Victor, Yong, Zheng-Xin, Webson, Albert, Raffel, Colin, Nayak, Nihal V., Sharma, Abheesht, Kim, Taewoon, Bari, M Saiful, Fevry, Thibault, Alyafeai, Zaid, Dey, Manan, Santilli, Andrea, Sun, Zhiqing, Ben-David, Srulik, Xu, Canwen, Chhablani, Gunjan, Wang, Han, Fries, Jason Alan, Al-shaibani, Maged S., Sharma, Shanya, Thakker, Urmish, Almubarak, Khalid, Tang, Xiangru, Radev, Dragomir, Jiang, Mike Tian-Jian, Rush, Alexander M.
PromptSource is a system for creating, sharing, and using natural language prompts. Prompts are functions that map an example from a dataset to a natural language input and target output. Using prompts to train and query language models is an emerging… (a usage sketch follows below)
External link:
http://arxiv.org/abs/2202.01279
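PromptSource's central abstraction is exactly the function described above: a template that maps a dataset example to an (input, target) text pair. A short usage sketch following the project's documented API (assumes the promptsource and datasets packages are installed):

# Usage sketch: a PromptSource template maps a dataset example to (input, target) text.
# Assumes `pip install promptsource datasets`.
from datasets import load_dataset
from promptsource.templates import DatasetTemplates

example = load_dataset("ag_news", split="train")[0]
templates = DatasetTemplates("ag_news")

template = templates[templates.all_template_names[0]]   # pick any available template
input_text, target_text = template.apply(example)
print(input_text)
print("->", target_text)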
Author:
Piriyakulkij, Wasu, Menghini, Cristina, Briden, Ross, Nayak, Nihal V., Zhu, Jeffrey, Raisi, Elaheh, Bach, Stephen H.
Machine learning practitioners often have access to a spectrum of data: labeled data for the target task (which is often limited), unlabeled data, and auxiliary data, i.e., the many available labeled datasets for other tasks. We describe TAGLETS, a system…
External link:
http://arxiv.org/abs/2111.04798
Author:
Nayak, Nihal V., Bach, Stephen H.
Zero-shot learning relies on semantic class representations such as hand-engineered attributes or learned embeddings to predict classes without any labeled examples. We propose to learn class representations by embedding nodes from common sense knowledge graphs… (a simplified sketch follows below)
External link:
http://arxiv.org/abs/2006.10713
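The recipe sketched in that abstract is to derive each class representation from its knowledge-graph neighborhood and then classify by similarity to those vectors. A deliberately simplified illustration (random stand-in node features and mean-pooling in place of the learned graph aggregator used in the paper):

# Simplified sketch of graph-derived class representations for zero-shot prediction.
# Node features are random stand-ins and mean-pooling replaces the learned graph
# neural network; only the overall recipe is illustrated.
import torch

dim = 64
graph = {"zebra": ["horse", "stripes", "savanna"],     # tiny toy "knowledge graph"
         "whale": ["ocean", "mammal", "fin"]}
node_feat = {n: torch.randn(dim) for nodes in graph.values() for n in nodes}

def class_representation(cls: str) -> torch.Tensor:
    # Aggregate the class's graph neighborhood into a single class vector.
    return torch.stack([node_feat[n] for n in graph[cls]]).mean(dim=0)

class_vecs = {c: class_representation(c) for c in graph}
image_feat = torch.randn(dim)                           # pretend image embedding
scores = {c: torch.cosine_similarity(v, image_feat, dim=0).item()
          for c, v in class_vecs.items()}
print(max(scores, key=scores.get), scores)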
Author:
Isikdogan, Leo F, Nayak, Bhavin V, Wu, Chyuan-Tyng, Moreira, Joao Peralta, Rao, Sushma, Michael, Gilad
We propose a system composed of fixed-topology neural networks with partially frozen weights, named SemifreddoNets. SemifreddoNets work as fully pipelined hardware blocks that are optimized for an efficient hardware implementation. Those blocks… (a minimal software sketch of the partially frozen idea follows below)
External link:
http://arxiv.org/abs/2006.06888
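The partially frozen idea has a direct software analogue: fix most parameters (the portion one would bake into silicon) and leave a small slice trainable for adaptation. A minimal PyTorch sketch of that split, which illustrates the weight-freezing pattern only and not the hardware design itself:

# Minimal sketch of a fixed-topology network with partially frozen weights:
# early layers are frozen (the "baked-in" part), the tail remains trainable.
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),   # frozen portion ends here
    nn.Conv2d(16, 8, 3, padding=1), nn.ReLU(),    # trainable portion
    nn.Conv2d(8, 3, 3, padding=1),
)

for layer in list(model.children())[:4]:          # freeze the first two conv+ReLU pairs
    for p in layer.parameters():
        p.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} / {total}")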
We propose fixed-function neural network hardware that is designed to perform pixel-to-pixel image transformations in a highly efficient way. We use a fully trainable, fixed-topology neural network to build a model that can perform a wide variety of… (a toy topology sketch follows below)
External link:
http://arxiv.org/abs/2001.00630
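In software terms, a fixed-topology pixel-to-pixel model is a small fully convolutional network whose output keeps the input's spatial size, so different image transformations become different weight sets for the same structure. A toy sketch of such a topology (all sizes chosen arbitrarily for illustration):

# Toy fixed-topology pixel-to-pixel network: the output keeps the input's spatial
# size, so swapping weight sets swaps the image transformation being computed.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)

image = torch.rand(1, 3, 64, 64)   # N x C x H x W
output = net(image)
print(output.shape)                # torch.Size([1, 3, 64, 64])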