Showing 1 - 10 of 32 for search: '"Sara Stymne"'
Published in:
Computational Linguistics, Vol 46, Iss 4, Pp 763-784 (2021)
Abstract: There is a growing interest in investigating what neural NLP models learn about language. A prominent open question is whether it is necessary to model hierarchical structure. We present a linguistic investigation of a
External link:
https://doaj.org/article/7b4d6e78cc6e49f294e0664fb59690c8
Author:
Agata Savary, Cherifa Ben Khelil, Carlos Ramisch, Voula Giouli, Verginica Barbu Mititelu, Najet Hadj Mohamed, Cvetana Krstev, Chaya Liebeskind, Hongzhi Xu, Sara Stymne, Tunga Güngör, Thomas Pickard, Bruno Guillaume, Eduard Bejček, Archna Bhatia, Marie Candito, Polona Gantar, Uxoa Iñurrieta, Albert Gatt, Jolanta Kovalevskaite, Timm Lichte, Nikola Ljubešić, Johanna Monti, Carla Parra Escartín, Mehrnoush Shamsfard, Ivelina Stoyanova, Veronika Vincze, Abigail Walsh
External link:
https://explore.openaire.eu/search/publication?articleId=od______3684::4a6b68a3252f2673e7b5abbea37d5f4a
https://hdl.handle.net/11574/216660
Author:
Sebastian Reimann, Sara Stymne
Finding causal relations in text is an important task for many types of textual analysis. It is a challenging task, especially for the many languages with little or no annotated training data available. To overcome this issue, we explore cross-l
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::861252533db066e062d63c06146ea808
http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-482423
Author:
Rafal Cerniavski, Sara Stymne
Published in:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022).
Published in:
SemEval@ACL/IJCNLP
We describe the Uppsala NLP submission to SemEval-2021 Task 2 on multilingual and cross-lingual word-in-context disambiguation. We explore the usefulness of three pre-trained multilingual language models, XLM-RoBERTa (XLMR), Multilingual BERT (mBERT)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::1fb913c206b91da809f1c92ab02bf73b
Published in:
Ruby, A, Hardmeier, C & Stymne, S 2021, 'A mention-based system for revision requirements detection', in Proceedings of the First Workshop on Understanding Implicit and Underspecified Language (UnImplicit), Association for Computational Linguistics, pp. 58-63. <https://aclanthology.org/2021.unimplicit-1.7.pdf>
Exploring aspects of sentential meaning that are implicit or underspecified in context is important for sentence understanding. In this paper, we propose a novel architecture based on mentions for revision requirements detection. The goal is to impro
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::cdb08e152adb23ea86268bd55c6c0b9f
https://pure.itu.dk/portal/da/publications/130119f2-e6f5-4b18-a96b-6380f9261ac9
Author:
Sara Stymne, Giuseppe Della Corte
Published in:
Proceedings of the First International Workshop on Natural Language Processing Beyond Text.
Author:
Sara Stymne
Published in:
TLT
We show how we can adapt parsing to low-resource domains by combining treebanks across languages for a parser model with treebank embeddings. We demonstrate how we can take advantage of in-domain treebanks from other languages, and show that this is
Published in:
de Lhoneux, M, Stymne, S & Nivre, J 2020, 'What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?', Computational Linguistics, vol. 46, no. 4, pp. 763-784. https://doi.org/10.1162/coli_a_00392
There is a growing interest in investigating what neural NLP models learn about language. A prominent open question is whether it is necessary to model hierarchical structure. We present a linguistic investigation of a neural p
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a83d5eec7fc8d54f77452e347118f0f1
http://arxiv.org/abs/1907.07950
Published in:
WMT
Šoštarić, M, Hardmeier, C & Stymne, S 2018, 'Discourse-Related Language Contrasts in English-Croatian Human and Machine Translation', in Proceedings of the Third Conference on Machine Translation: Research Papers, pp. 36-48, EMNLP 2018 Third Conference on Machine Translation (WMT18), Brussels, Belgium, 31/10/18. https://doi.org/10.18653/v1/W18-6305
We present an analysis of a number of coreference phenomena in English-Croatian human and machine translations. The aim is to shed light on the differences in the way these structurally different languages make use of discourse information and provid