Showing 1 - 10 of 77 for the search: '"Donald Metzler"'
Published in:
Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval.
Published in:
ACM SIGIR Forum. 55:1-27
When experiencing an information need, users want to engage with a domain expert, but often turn to an information retrieval system, such as a search engine, instead. Classical information retrieval systems do not answer information needs directly …
Author:
Alyssa Lees, Vinh Q. Tran, Yi Tay, Jeffrey Sorensen, Jai Gupta, Donald Metzler, Lucy Vasserman
On the world wide web, toxic content detectors are a crucial line of defense against potentially hateful and offensive messages. As such, building highly effective classifiers that enable a safer internet is an important research area. …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::296282522a24425881a98c4051ae19cc
http://arxiv.org/abs/2202.11176
Published in:
Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining.
Published in:
WSDM
Large generative language models such as GPT-2 are well-known for their ability to generate text, as well as for their utility in supervised downstream tasks via fine-tuning. Their prevalence on the web, however, is still not well understood …
Published in:
ACL/IJCNLP (Findings)
In the pursuit of a deeper understanding of a model's behaviour, there is recent impetus for developing suites of probes aimed at diagnosing models beyond simple metrics like accuracy or BLEU. This paper takes a step back and asks an important and timely question …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::0054820b23a70753a0e8a4e07d039fc3
Published in:
ACL/IJCNLP (1)
In the era of pre-trained language models, Transformers are the de facto choice of model architectures. While recent research has shown promise in entirely convolutional, or CNN, architectures, they have not been explored using the pre-train-fine-tune paradigm …
Published in:
ACL/IJCNLP (1)
There are two major classes of natural language grammar: the dependency grammar, which models one-to-one correspondences between words, and the constituency grammar, which models the assembly of one or several corresponding words. While previous unsupervised …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2ecd84c2d59a8f2537d939d118d3ebd4
http://arxiv.org/abs/2012.00857
Transformer model architectures have garnered immense interest lately due to their effectiveness across a range of domains like language, vision, and reinforcement learning. In the field of natural language processing, for example, Transformers have …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::f6db57a3d0e1f1fccc07994657a8350e
http://arxiv.org/abs/2009.06732
Published in:
KDD
Many modern recommender systems train their models on a large amount of implicit user feedback data. Due to the inherent bias in this data (e.g., position bias), learning from it directly can lead to suboptimal models. Recently, unbiased learning …
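To illustrate the position-bias correction the abstract alludes to, below is a minimal sketch of inverse propensity scoring (IPS), one common unbiased-learning technique for implicit feedback. The propensity values and click data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of inverse propensity scoring (IPS) for position bias.
# Assumption: we know (or have estimated) the probability that a user
# examines each rank position; these numbers are made up for illustration.
import numpy as np

# Simulated implicit feedback: clicks observed at ranks 0..4.
clicks = np.array([1, 0, 1, 0, 0], dtype=float)

# Assumed examination propensities: top positions are examined more often,
# so clicks there carry less evidence of relevance per impression.
propensities = np.array([1.0, 0.7, 0.5, 0.35, 0.25])

# IPS re-weights each observed click by the inverse of its propensity,
# yielding an unbiased estimate of relevance-driven clicks in expectation.
ips_weights = clicks / propensities

print(ips_weights.tolist())  # the click at rank 2 is up-weighted to 2.0
```

The up-weighted clicks can then serve as training targets or loss weights, so that items shown at low-visibility positions are not systematically penalized.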