Showing 1 - 6 of 6 for search: '"Geewook Kim"'
Author:
Geewook Kim, Teakgyu Hong, Moonbin Yim, JeongYeon Nam, Jinyoung Park, Jinyeong Yim, Wonseok Hwang, Sangdoo Yun, Dongyoon Han, Seunghyun Park
Published in:
Lecture Notes in Computer Science ISBN: 9783031198144
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::ac9f99e2756dd777791b2b3a1f06d704
https://doi.org/10.1007/978-3-031-19815-1_29
Published in:
COLING
This paper introduces a method that efficiently reduces the computational cost and parameter size of the Transformer. The proposed model, referred to as Group-Transformer, splits the feature space into multiple groups, factorizes the calculation paths, and reduces…
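The grouped factorization this abstract describes lends itself to a short illustration. Below is a minimal sketch of the general idea, not the paper's implementation: split the hidden dimension into groups and apply an independent small linear map per group, which divides a dense layer's parameter count by the number of groups. The names GroupedLinear and num_groups are assumptions for illustration.

import torch
import torch.nn as nn

class GroupedLinear(nn.Module):
    """Split the feature dimension into groups and apply a small linear
    map per group, cutting parameters from d*d to roughly d*d/num_groups."""
    def __init__(self, dim: int, num_groups: int):
        super().__init__()
        assert dim % num_groups == 0
        self.num_groups = num_groups
        group_dim = dim // num_groups
        self.projs = nn.ModuleList(
            [nn.Linear(group_dim, group_dim) for _ in range(num_groups)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim); split along the feature axis into groups,
        # transform each group independently, then concatenate.
        chunks = x.chunk(self.num_groups, dim=-1)
        return torch.cat([proj(c) for proj, c in zip(self.projs, chunks)], dim=-1)

# Parameter count drops by roughly a factor of num_groups:
full = nn.Linear(512, 512)                  # 512*512 + 512 parameters
grouped = GroupedLinear(512, num_groups=4)  # 4 * (128*128 + 128) parameters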
Author:
Sungrae Park, Geewook Kim, Junyeop Lee, Sangdoo Yun, Jeonghun Baek, Hwalsuk Lee, Dongyoon Han, Seong Joon Oh
Published in:
ICCV
Many new proposals for scene text recognition (STR) models have been introduced in recent years. While each claims to have pushed the boundary of the technology, a holistic and fair comparison has been largely missing in the field due to the inconsistent…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a937edce67174924ecb30a073b43f0b8
http://arxiv.org/abs/1904.01906
Published in:
NAACL-HLT (1)
We propose a new type of representation learning method that models words, phrases, and sentences seamlessly. Our method does not depend on word segmentation or any human-annotated resources (e.g., word dictionaries), yet it is very effective for noisy… (a related segmentation-free sketch appears after the final entry below)
Published in:
IJCAI
We propose weighted inner product similarity (WIPS) for neural network-based graph embedding. In addition to the parameters of neural networks, we optimize the weights of the inner product by allowing positive and negative values. Despite… (a minimal code sketch follows this entry's links)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::296b2cea2e4113e07016a36daa2b84d4
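The weighted inner product described above admits a very small sketch: a learnable per-dimension weight vector, unconstrained in sign, inserted into the inner product and trained jointly with the embedding network. This is a minimal illustration under those assumptions, not the paper's reference implementation; the class name WIPS and the shapes are illustrative.

import torch
import torch.nn as nn

class WIPS(nn.Module):
    """Similarity(x, y) = sum_i w_i * x_i * y_i with learnable, sign-unconstrained w."""
    def __init__(self, dim: int):
        super().__init__()
        # All-positive weights recover the plain inner product; mixed signs
        # can express indefinite (e.g., Lorentzian-style) similarities.
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # x, y: (batch, dim) -> (batch,) similarity scores
        return (self.weight * x * y).sum(dim=-1)

sim = WIPS(dim=64)
x, y = torch.randn(8, 64), torch.randn(8, 64)
scores = sim(x, y)  # optimized jointly with the embedding network's parameters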
Published in:
NUT@EMNLP
We propose a new word embedding method called word-like character n-gram embedding, which learns distributed representations of words by embedding word-like character n-grams. Our method is an extension of the recently proposed segmentation-free word embedding…
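The segmentation-free idea in this entry (and the NAACL-HLT entry above) can be illustrated briefly: extract character n-grams directly from raw text with no word segmentation, embed those that appear in a vocabulary of frequent, word-like n-grams, and compose them into a vector for a larger unit. The vocabulary filter and mean-pooling composition below are illustrative assumptions, not either paper's training procedure.

import torch
import torch.nn as nn

def char_ngrams(text: str, n_min: int = 2, n_max: int = 4) -> list:
    # All character n-grams of the raw string; no word segmentation is used.
    return [text[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(text) - n + 1)]

class NGramEmbedding(nn.Module):
    def __init__(self, vocab: dict, dim: int = 128):
        super().__init__()
        self.vocab = vocab  # e.g., frequent ("word-like") n-grams from a corpus
        self.emb = nn.Embedding(len(vocab), dim)

    def forward(self, text: str) -> torch.Tensor:
        # Embed the in-vocabulary n-grams of the raw text and mean-pool them.
        ids = [self.vocab[g] for g in char_ngrams(text) if g in self.vocab]
        if not ids:  # fall back to a zero vector for fully out-of-vocabulary input
            return torch.zeros(self.emb.embedding_dim)
        return self.emb(torch.tensor(ids)).mean(dim=0)

vocab = {g: i for i, g in enumerate(set(char_ngrams("noisy user-generated text")))}
model = NGramEmbedding(vocab)
vec = model("user-generated")  # embeds unsegmented, noisy text directly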