Showing 1 - 3 of 3 for search: '"Robert Turko"'
Published in:
Scopus-Elsevier
Why do large pre-trained transformer-based models perform so well across a wide variety of NLP tasks? Recent research suggests the key may lie in the multi-headed attention mechanism's ability to learn and represent linguistic information. Understanding…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::bb6bbeb6d532f55020749ca943c7e8fc
Author:
Zijie J. Wang, Nilaksh Das, Haekyu Park, Omar Shaikh, Minsuk Kahng, Fred Hohman, Robert Turko, Duen Horng Polo Chau
Published in:
IEEE transactions on visualization and computer graphics. 27(2)
Deep learning's great success motivates many practitioners and students to learn about this exciting technology. However, it is often challenging for beginners to take their first step due to the complexity of understanding and applying deep learning…
Author:
Minsuk Kahng, Robert Turko, Haekyu Park, Duen Horng Chau, Fred Hohman, Nilaksh Das, Zijie J. Wang, Omar Shaikh
Published in:
CHI Extended Abstracts
The success of deep learning in solving problems previously thought to be hard has inspired many non-experts to learn and understand this exciting technology. However, it is often challenging for learners to take the first steps due to the complexity of deep…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::b36412a568c06a33b13d719a739cc86b
http://arxiv.org/abs/2001.02004