Showing 1 - 10 of 17
for search: '"Talman, Aarne"'
Author:
Luukkonen, Risto, Burdge, Jonathan, Zosa, Elaine, Talman, Aarne, Komulainen, Ville, Hatanpää, Väinö, Sarlin, Peter, Pyysalo, Sampo
The pretraining of state-of-the-art large language models now requires trillions of words of text, which is orders of magnitude more than available for the vast majority of languages. While including text in more than one language is an obvious way t…
External link:
http://arxiv.org/abs/2404.01856
This paper introduces Bayesian uncertainty modeling using Stochastic Weight Averaging-Gaussian (SWAG) in Natural Language Understanding (NLU) tasks. We apply the approach to standard tasks in natural language inference (NLI) and demonstrate the effec…
External link:
http://arxiv.org/abs/2304.04726
A central question in natural language understanding (NLU) research is whether high performance demonstrates the models' strong reasoning capabilities. We present an extensive series of controlled experiments where pre-trained language models are exp…
External link:
http://arxiv.org/abs/2201.04467
Pre-trained neural language models give high performance on natural language inference (NLI) tasks. But whether they actually understand the meaning of the processed sequences remains unclear. We propose a new diagnostics test suite which allows to a…
External link:
http://arxiv.org/abs/2104.04751
Author:
Talman, Aarne, Suni, Antti, Celikkanat, Hande, Kakouros, Sofoklis, Tiedemann, Jörg, Vainio, Martti
In this paper we introduce a new natural language processing dataset and benchmark for predicting prosodic prominence from written text. To our knowledge this will be the largest publicly available dataset with prosodic labels. We describe the datase…
External link:
http://arxiv.org/abs/1908.02262
Author:
Talman, Aarne, Sulubacak, Umut, Vázquez, Raúl, Scherrer, Yves, Virpioja, Sami, Raganato, Alessandro, Hurskainen, Arvi, Tiedemann, Jörg
In this paper, we present the University of Helsinki submissions to the WMT 2019 shared task on news translation in three language pairs: English-German, English-Finnish and Finnish-English. This year, we focused first on cleaning and filtering the t…
External link:
http://arxiv.org/abs/1906.04040
Neural network models have been very successful in natural language inference, with the best models reaching 90% accuracy in some benchmarks. However, the success of these models turns out to be largely benchmark specific. We show that models trained…
External link:
http://arxiv.org/abs/1810.09774
Published in:
Nat. Lang. Eng. 25 (2019) 467-482
Sentence-level representations are necessary for various NLP tasks. Recurrent neural networks have proven to be very effective in learning distributed representations and can be trained efficiently on natural language inference tasks. We build on top…
External link:
http://arxiv.org/abs/1808.08762
Neural network models have been very successful in natural language inference, with the best models reaching 90% accuracy in some benchmarks. However, the success of these models turns out to be largely benchmark specific. We show that models trained…
External link:
https://explore.openaire.eu/search/publication?articleId=od______1593::42035722f9d95db5b9bd2b73f693a7fb
http://hdl.handle.net/10138/304485