Showing 1 - 10 of 31 for the search: '"Bill Dolan"'
Author:
Ryan Volum, Sudha Rao, Michael Xu, Gabriel DesGarennes, Chris Brockett, Benjamin Van Durme, Olivia Deng, Akanksha Malhotra, Bill Dolan
Published in:
Proceedings of the 3rd Wordplay: When Language Meets Games Workshop (Wordplay 2022).
Large pretrained generative models like GPT-3 often suffer from hallucinating non-existent or incorrect content, which undermines their potential merits in real applications. Existing work usually attempts to detect these hallucinations based on a co…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::0905a324e07610518dbc6cd6fbfb91f1
http://arxiv.org/abs/2104.08704
Author:
Yizhe Zhang, Siqi Sun, Xiang Gao, Yuwei Fang, Chris Brockett, Michel Galley, Jianfeng Gao, Bill Dolan
Recent advances in large-scale pre-training such as GPT-3 allow seemingly high-quality text to be generated from a given prompt. However, such generation systems often suffer from problems of hallucinated facts, and are not inherently designed to inc…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::416976196477cde44dcc6fdf9de125ba
Author:
Chris Brockett, Felix Faltings, Michel Galley, Bill Dolan, Jianfeng Gao, Gerold Hintz, Chris Quirk
Published in:
NAACL-HLT
A prevailing paradigm in neural text generation is one-shot generation, where text is produced in a single step. The one-shot setting is inadequate, however, when the constraints the user wishes to impose on the generated text are dynamic, especially…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::b3ce113ae4a53b01ce0bd7a43d8a5a58
http://arxiv.org/abs/2010.12826
Published in:
NAACL-HLT
Adversarial examples expose the vulnerabilities of natural language processing (NLP) models, and can be used to evaluate and improve their robustness. Existing techniques of generating such examples are typically driven by local heuristic rules that…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::78d32f204edce85e345d205fe51f022c
http://arxiv.org/abs/2009.07502
Published in:
CVPR Workshops
Self-supervised pretraining has become a strong force in both language and vision tasks. Current efforts to improve the effects of pretraining focus on improving network architecture or defining new tasks to extract representations from the data. We…
Published in:
EMNLP (1)
Large-scale pre-trained language models, such as BERT and GPT-2, have achieved excellent performance in language representation learning and free-form text generation. However, these models cannot be directly employed to generate text under specified…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::6c028409af1c6d8678444c172c50fefa
http://arxiv.org/abs/2005.00558
Author:
Chris Brockett, Elnaz Nouri, Sudha Rao, Angela S. Lin, Asli Celikyilmaz, Debadeepta Dey, Bill Dolan
Published in:
ACL
Many high-level procedural tasks can be decomposed into sequences of instructions that vary in their order and choice of tools. In the cooking domain, the web offers many partially-overlapping text and video recipes (i.e. procedures) that describe ho…
Published in:
ACL (demo)
We present MixingBoard, a platform for quickly building demos with a focus on knowledge-grounded stylized text generation. We unify existing text generation algorithms in a shared codebase and further adapt earlier algorithms for constrained generati…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::9a3a9ed906d10c650dba3666b9b89786
Published in:
EMNLP (1)
Existing language models excel at writing from scratch, but many real-world scenarios require rewriting an existing document to fit a set of constraints. Although sentence-level rewriting has been fairly well-studied, little work has addressed the ch…