Showing 1 - 10 of 17
for search: '"Maurya, Avinash"'
Transformers and large language models~(LLMs) have seen rapid adoption in all domains. Their sizes have exploded to hundreds of billions of parameters and keep increasing. Under these circumstances, the training of transformers is very expensive and …
External link:
http://arxiv.org/abs/2410.21316
Transformers and LLMs have seen rapid adoption in all domains. Their sizes have exploded to hundreds of billions of parameters and keep increasing. Under these circumstances, the training of transformers is slow and often takes in the order of weeks …
External link:
http://arxiv.org/abs/2406.10728
LLMs have seen rapid adoption in all domains. They need to be trained on high-end high-performance computing (HPC) infrastructures and ingest massive amounts of input data. Unsurprisingly, at such a large scale, unexpected events (e.g., failures of c…
External link:
http://arxiv.org/abs/2406.10707
Academic article
Sign-in is required to view this result.
Author:
Maurya, Avinash Chandra1, Verma, Sunil Kumar1, Kumar, Sushil2 sushilangrau@gmail.com, Lakra, Kairovin2
Published in:
Journal of Applied & Natural Science. 2019, Vol. 11 Issue 2, p384-387. 4p.
Author:
Maurya, Avinash1 (AUTHOR) avinashmaurya@nitp.ac.in, Mishra, Ambarisha1 (AUTHOR)
Published in:
Renewable Energy Focus. Sep2022, Vol. 42, p33-47. 15p.
Author:
Goswami, Gargi1 gargi.goswami1423@gmail.com, Singh, Yashwant2, Raghuveer, Munigela3, Maurya, Avinash Chandra4
Published in:
Environment & Ecology. Jul-Sep2022, Vol. 40 Issue 3, p1098-1102. 5p.
Author:
Maurya, Avinash, Mishra, Ambarisha
Published in:
Engineering Research Express; Jun2022, Vol. 4 Issue 2, p1-13, 13p
Published in:
2015 Annual IEEE India Conference (INDICON); 2015, p1-5, 5p