Showing 1 - 5 of 5 for search: '"Tyrolski, Michał"'
Author:
Zawalski, Michał, Góral, Gracjan, Tyrolski, Michał, Wiśnios, Emilia, Budrowski, Franciszek, Kuciński, Łukasz, Miłoś, Piotr
Efficiently tackling combinatorial reasoning problems, particularly the notorious NP-hard tasks, remains a significant challenge for AI research. Recent efforts have sought to enhance planning by incorporating hierarchical high-level search strategies…
External link:
http://arxiv.org/abs/2406.03361
Author:
Zawalski, Michał, Tyrolski, Michał, Czechowski, Konrad, Odrzygóźdź, Tomasz, Stachura, Damian, Piękos, Piotr, Wu, Yuhuai, Kuciński, Łukasz, Miłoś, Piotr
Complex reasoning problems contain states that vary in the computational cost required to determine a good action plan. Taking advantage of this property, we propose Adaptive Subgoal Search (AdaSubS), a search method that adaptively adjusts the planning…
External link:
http://arxiv.org/abs/2206.00702
Author:
Nawrot, Piotr, Tworkowski, Szymon, Tyrolski, Michał, Kaiser, Łukasz, Wu, Yuhuai, Szegedy, Christian, Michalewski, Henryk
Transformer models yield impressive results on many NLP and sequence modeling tasks. Remarkably, Transformers can handle long sequences, which allows them to produce long coherent outputs: full paragraphs produced by GPT-3 or well-structured images produced…
External link:
http://arxiv.org/abs/2110.13711
Author:
Giziński, Stanisław, Preibisch, Grzegorz, Kucharski, Piotr, Tyrolski, Michał, Rembalski, Michał, Grzegorczyk, Piotr, Gambin, Anna
Published in:
Methods, April 2024, 224:1-9
Author:
Nawrot, Piotr, Tworkowski, Szymon, Tyrolski, Michał, Kaiser, Łukasz, Wu, Yuhuai, Szegedy, Christian, Michalewski, Henryk
Published in:
Findings of the Association for Computational Linguistics: NAACL 2022.
Transformer models yield impressive results on many NLP and sequence modeling tasks. Remarkably, Transformers can handle long sequences, which allows them to produce long coherent outputs: full paragraphs produced by GPT-3 or well-structured images produced…