Showing 1 - 3 of 3 for search: '"Hongjin SU"'
Author:
Hongjin Su, Jungo Kasai, Chen Henry Wu, Weijia Shi, Tianlu Wang, Jiayi Xin, Rui Zhang, Mari Ostendorf, Luke Zettlemoyer, Noah A. Smith, Tao Yu
Many recent approaches to natural language tasks are built on the remarkable abilities of large language models. Large language models can perform in-context learning, where they learn a new task from a few task demonstrations, without any parameter updates.
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e7b6e00a66a5367211bb3b9452cde0dd
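The in-context learning described in this record's abstract is easy to see in a short prompt-construction sketch. The example below is hypothetical: it uses the Hugging Face transformers text-generation pipeline with GPT-2 as a stand-in for a large language model, and the sentiment task, demonstrations, and prompt format are illustrative, not taken from the paper.

```python
# Minimal sketch of in-context learning: the model sees a few labeled
# demonstrations in its prompt and labels a new input without any
# parameter updates. Model choice and prompt format are illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # stand-in for a large LM

demonstrations = [
    ("The movie was a delight from start to finish.", "positive"),
    ("I want those two hours of my life back.", "negative"),
]
query = "A charming, heartfelt story."

# Concatenate demonstrations, then append the unlabeled query.
prompt = "".join(f"Review: {text}\nSentiment: {label}\n\n"
                 for text, label in demonstrations)
prompt += f"Review: {query}\nSentiment:"

output = generator(prompt, max_new_tokens=2, do_sample=False)
# The pipeline returns the prompt plus the completion; keep the completion.
print(output[0]["generated_text"][len(prompt):].strip())
```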
Author:
Hongjin Su, Weijia Shi, Jungo Kasai, Yizhong Wang, Yushi Hu, Mari Ostendorf, Wen-tau Yih, Noah A. Smith, Luke Zettlemoyer, Tao Yu
We introduce INSTRUCTOR, a new method for computing text embeddings given task instructions: every text input is embedded together with instructions explaining the use case (e.g., task and domain descriptions). Unlike encoders from prior work that are more specialized, INSTRUCTOR is a single embedder that can generate text embeddings tailored to different downstream tasks and domains, without any further training.
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::cfe85cf00e250927a133ff061da7fb26
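The instruction-plus-text input format is easiest to see through the InstructorEmbedding package published alongside the paper. The snippet below follows the usage documented in the authors' repository (pip install InstructorEmbedding); the instruction strings and inputs are illustrative.

```python
# Embed each text together with an instruction describing the task and
# domain the embedding will be used for, per the INSTRUCTOR repository.
from InstructorEmbedding import INSTRUCTOR

model = INSTRUCTOR("hkunlp/instructor-large")

# Each input is an [instruction, text] pair.
pairs = [
    ["Represent the Science title:", "3D ActionSLAM: wearable person tracking"],
    ["Represent the Medicine sentence for retrieval:", "Aspirin reduces fever."],
]
embeddings = model.encode(pairs)
print(embeddings.shape)  # (2, 768) for instructor-large
```

Because the instruction is part of the input, the same checkpoint yields differently specialized embeddings for, say, classification versus retrieval, with no further training.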
Published in:
ACL/IJCNLP (1)
Large pre-trained models such as BERT are known to improve different downstream NLP tasks, even when such a model is trained on a generic domain. Moreover, recent studies have shown that when large domain-specific corpora are available, continued pretraining …
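Continued pretraining on a domain corpus can be sketched with the standard masked-language-modeling setup in Hugging Face transformers. This is a hedged sketch, not the paper's recipe: the corpus file, checkpoint, and hyperparameters below are placeholders.

```python
# Hedged sketch of continued pretraining: resume BERT's masked-LM
# objective on a domain-specific corpus before any downstream fine-tuning.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# domain_corpus.txt is a hypothetical file of in-domain text, one
# document per line.
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# The collator randomly masks tokens, reproducing the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
args = TrainingArguments(output_dir="bert-domain", num_train_epochs=1,
                         per_device_train_batch_size=16)
Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```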