Showing 1 - 10 of 15 for search: '"Sascha Rothe"'
Author:
Sascha Rothe, Hinrich Schütze
Published in:
Computational Linguistics, Vol 43, Iss 3 (2017)
We present AutoExtend, a system that combines word embeddings with semantic resources by learning embeddings for non-word objects like synsets and entities and learning word embeddings that incorporate the semantic information from the resource. The …
External link:
https://doaj.org/article/0ebd564b7c6e4ea6b631c12d78ee8034
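As a rough illustration of the general idea of deriving embeddings for non-word objects (such as WordNet synsets) from word embeddings, the naive baseline below simply averages the member words' vectors. This is not AutoExtend itself, which learns the mapping jointly; the vectors and synset membership here are made-up toy values.

```python
import numpy as np

# Toy word embeddings (made-up 3-dimensional vectors, not real data)
word_vecs = {
    "car": np.array([0.9, 0.1, 0.0]),
    "auto": np.array([0.8, 0.2, 0.1]),
    "automobile": np.array([0.85, 0.15, 0.05]),
}

def synset_embedding(members, word_vecs):
    """Naive synset embedding: average the member words' vectors."""
    return np.mean([word_vecs[w] for w in members], axis=0)

vec = synset_embedding(["car", "auto", "automobile"], word_vecs)
print(vec.round(3))  # one vector for the whole synset
```

AutoExtend goes further by treating words as sums of their lexemes and learning the synset and lexeme vectors jointly; the averaging above is only the crudest possible baseline for the same goal.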
Published in:
ACL/IJCNLP (2)
This paper presents a simple recipe to train state-of-the-art multilingual Grammatical Error Correction (GEC) models. We achieve this by first proposing a language-agnostic method to generate a large number of synthetic examples. The second ingredient …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e14f5da79dbb4592b93ec90992548bff
http://arxiv.org/abs/2106.03830
Published in:
ACL/IJCNLP (1)
Aralikatte, R., Narayan, S., Maynez, J., Rothe, S. & McDonald, R. 2021, 'Focus Attention: Promoting Faithfulness and Diversity in Summarization', in ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference, Association for Computational Linguistics, pp. 6078-6095, Virtual, Online, 01/08/2021. https://doi.org/10.18653/v1/2021.acl-long.474
Professional summaries are written with document-level information, such as the theme of the document, in mind. This is in contrast with most seq2seq decoders which simultaneously learn to focus on salient content, while deciding what to generate, at …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::9333312f70fafd66deb6825845736bf2
http://arxiv.org/abs/2105.11921
Published in:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing.
Published in:
NGT@ACL
We evaluate the performance of transformer encoders with various decoders for information organization through a new task: generation of section headings for Wikipedia articles. Our analysis shows that decoders containing attention mechanisms over the …
Published in:
EMNLP (1)
We propose Masker, an unsupervised text-editing method for style transfer. To tackle cases when no parallel source-target pairs are available, we train masked language models (MLMs) for both the source and the target domain. Then we find the text spans …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::1e640d348f59bb30dfb860967e8df5a3
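The span-selection idea the Masker abstract describes can be sketched as follows: compare how well a source-domain model and a target-domain model score each span, and mask spans the source model prefers much more strongly, since those carry the source style. The scores below are toy numbers standing in for MLM log-likelihoods; the threshold and span segmentation are illustrative assumptions, not the paper's actual procedure.

```python
# Hedged sketch of span selection for unsupervised style transfer:
# mask the spans on which a source-domain and a target-domain language
# model disagree most. Scores are toy values, not real MLM outputs.

def spans_to_mask(spans, source_scores, target_scores, threshold=1.0):
    """Return spans whose source-domain log-likelihood exceeds the
    target-domain one by more than `threshold` (style-specific text)."""
    return [
        span
        for span, src, tgt in zip(spans, source_scores, target_scores)
        if src - tgt > threshold
    ]

spans = ["the movie was", "absolutely dreadful", "from start to finish"]
src = [-2.0, -1.5, -2.2]   # toy log-likelihoods under the source-domain model
tgt = [-2.1, -6.0, -2.4]   # toy log-likelihoods under the target-domain model
print(spans_to_mask(spans, src, tgt))
```

In a full system, the masked spans would then be in-filled by the target-domain MLM to produce the style-transferred text.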
Author:
Hinrich Schütze, Sascha Rothe
Published in:
Computational Linguistics. 43:593-617
We present AutoExtend, a system that combines word embeddings with semantic resources by learning embeddings for non-word objects like synsets and entities and learning word embeddings that incorporate the semantic information from the resource. The …
Published in:
EMNLP/IJCNLP (1)
We propose LaserTagger - a sequence tagging approach that casts text generation as a text editing task. Target texts are reconstructed from the inputs using three main edit operations: keeping a token, deleting it, and adding a phrase before the token …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::97c04c75a23235dce7b2dc678cc5f656
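The three edit operations named in the abstract (keep, delete, add-phrase-before) can be sketched as a tiny tag-application step. The tag encoding and the sentence-fusion example below are illustrative assumptions, not the authors' actual tag vocabulary or implementation; the model itself would be the tagger that predicts these tags.

```python
# Minimal sketch of text generation as tag application, in the spirit
# of LaserTagger's keep/delete/add-phrase operations (illustrative only).

def apply_edits(tokens, tags):
    """Reconstruct a target text from source tokens and edit tags.

    Each tag is a pair (op, phrase): op is "KEEP" or "DELETE", and
    phrase (possibly empty) is inserted before the token.
    """
    out = []
    for token, (op, phrase) in zip(tokens, tags):
        if phrase:
            out.append(phrase)      # add a phrase before the token
        if op == "KEEP":
            out.append(token)       # keep the source token
        # op == "DELETE": drop the token
    return " ".join(out)

# Toy sentence-fusion example: merge two sentences into one.
source = ["Turing", "was", "born", "in", "1912", ".",
          "Turing", "died", "in", "1954", "."]
tags = [("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", ""),
        ("DELETE", ""), ("DELETE", "and"),
        ("KEEP", ""), ("KEEP", ""), ("KEEP", ""), ("KEEP", "")]
print(apply_edits(source, tags))
```

Because the output is reconstructed from the input plus a small phrase vocabulary, such a tagger needs far fewer decisions than a free-form seq2seq decoder.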
Published in:
Transactions of the Association for Computational Linguistics, Vol 8, Pp 264-280 (2020)
Unsupervised pre-training of large neural models has recently revolutionized Natural Language Processing. By warm-starting from the publicly released checkpoints, NLP practitioners have pushed the state-of-the-art on multiple benchmarks while saving …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::65c4c13c51ee03e8cc85a2733c1de6ea
Published in:
CoNLL
Motivated by recent findings on the probabilistic modeling of acceptability judgments, we propose syntactic log-odds ratio (SLOR), a normalized language model score, as a metric for referenceless fluency evaluation of natural language generation outputs …
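SLOR, as standardly defined, subtracts a sentence's unigram log-probability from its language-model log-probability and normalizes by sentence length, so that rare words are not unfairly penalized. The sketch below computes it from toy log-probabilities; the numbers stand in for real language-model and unigram estimates.

```python
import math

# Sketch of SLOR(s) = (ln p_M(s) - ln p_u(s)) / |s|, where p_M is the
# language-model probability and p_u the product of unigram probabilities.
# All probabilities below are toy values, not outputs of a real model.

def slor(sentence_logprob, token_unigram_logprobs):
    """Normalized LM score: remove unigram mass, divide by length."""
    n = len(token_unigram_logprobs)
    return (sentence_logprob - sum(token_unigram_logprobs)) / n

lm_logprob = math.log(1e-6)  # toy ln p_M(s) for a 4-token sentence
unigram_logprobs = [math.log(p) for p in (0.01, 0.002, 0.005, 0.01)]
print(round(slor(lm_logprob, unigram_logprobs), 3))
```

A higher SLOR indicates that the sentence is more probable under the language model than its word frequencies alone would predict, which is the intuition behind using it as a referenceless fluency score.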