Showing 1 - 8 of 8 for search: '"Jan Hula"'
Published in:
Neural Computing and Applications. 34:12029-12041
Published in:
IEEE Transactions on Fuzzy Systems. 28:1195-1204
Although data preprocessing is a universal technique that can be widely used in neural networks (NNs), most research in this area is focused on designing new NN architectures. In this paper, we propose a preprocessing technique that enriches the origina…
Published in:
Information Processing and Management of Uncertainty in Knowledge-Based Systems ISBN: 9783031089732
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::8f1501890c30c040439b0728e96e0993
https://doi.org/10.1007/978-3-031-08974-9_34
Published in:
2021 IEEE 33rd International Conference on Tools with Artificial Intelligence (ICTAI).
Published in:
2020 IEEE Third International Conference on Data Stream Mining & Processing (DSMP).
We describe the development of a custom OCR system designed specifically for linguistic analysis of texts printed during the early modern period. This analysis requires precise detection of individual graphemes, and we therefore could not…
We present a new version of YOLO with better performance, extended with instance segmentation, called Poly-YOLO. Poly-YOLO builds on the original ideas of YOLOv3 and removes two of its weaknesses: a large amount of rewritten labels and inefficient…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::e2f13e603c9d0f5a3e014e39e475cee7
http://arxiv.org/abs/2005.13243
Authors:
Vojtech Molek, Jan Hula
Published in:
Data Science and Knowledge Engineering for Sensing Decision Support.
Authors:
Benjamin Van Durme, Ellie Pavlick, R. Thomas McCoy, Raghavendra Pappagari, Patrick Xia, Najoung Kim, Yinghui Huang, Katherin Yu, Roma Patel, Jan Hula, Edouard Grave, Shuning Jin, Ian Tenney, Samuel R. Bowman, Berlin Chen, Alex Wang
Published in:
Scopus-Elsevier
ACL (1)
ACL (1)
Natural language understanding has recently seen a surge of progress with the use of sentence encoders like ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2019), which are pretrained on variants of language modeling. We conduct the first large-s…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::08523946800da4a7d663cf8e3afe49f7
http://www.scopus.com/inward/record.url?eid=2-s2.0-85084066669&partnerID=MN8TOARS