Showing 1 - 10 of 18,737 for the search '"A. Muralidharan"'
Published in:
Discover Oncology, Vol 13, Iss 1, Pp 1-23 (2022)
Abstract: Purpose: Metastatic spread of prostate cancer to the skeleton may result in debilitating bone pain. In this review, we address mechanisms underpinning the pathobiology of metastatic prostate cancer induced bone pain (PCIBP) that include sensi…
External link:
https://doaj.org/article/19f1f600843041ae916017cf971aab3e
Author:
Muralidharan, Varun, Cline, James M.
It has been proposed that the accelerated expansion of the universe can be explained by the merging of our universe with baby universes, resulting in dark energy with a phantom-like equation of state. However, the evidence in favor of it did not incl…
External link:
http://arxiv.org/abs/2408.13306
Author:
Sreenivas, Sharath Turuvekere, Muralidharan, Saurav, Joshi, Raviraj, Chochowski, Marcin, Patwary, Mostofa, Shoeybi, Mohammad, Catanzaro, Bryan, Kautz, Jan, Molchanov, Pavlo
We present a comprehensive report on compressing the Llama 3.1 8B and Mistral NeMo 12B models to 4B and 8B parameters, respectively, using pruning and distillation. We explore two distinct pruning strategies: (1) depth pruning and (2) joint hidden/at…
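The two pruning strategies named in the abstract can be illustrated with a toy sketch. This is a minimal illustration only, not the paper's actual method; the layer representation and the norm-based importance heuristic are assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a stack of square weight matrices, one per transformer layer.
hidden = 8
layers = [rng.standard_normal((hidden, hidden)) for _ in range(6)]

def depth_prune(layers, keep):
    """(1) Depth pruning: rank whole layers by a cheap importance proxy
    (here, Frobenius norm of the weights) and drop the weakest layers."""
    scores = [np.linalg.norm(w) for w in layers]
    keep_idx = sorted(np.argsort(scores)[-keep:])  # preserve layer order
    return [layers[i] for i in keep_idx]

def width_prune(layers, keep):
    """(2) Width (hidden-dimension) pruning: rank hidden channels by their
    aggregate weight magnitude across layers and keep the strongest ones."""
    channel_score = sum(np.abs(w).sum(axis=0) for w in layers)
    keep_idx = np.sort(np.argsort(channel_score)[-keep:])
    return [w[np.ix_(keep_idx, keep_idx)] for w in layers]

small_deep = depth_prune(layers, keep=4)   # fewer layers, same width
small_wide = width_prune(layers, keep=6)   # same depth, narrower layers
```

In practice, as the report describes, the pruned model is then retrained (typically with distillation from the original model) to recover accuracy.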
External link:
http://arxiv.org/abs/2408.11796
This project investigates the efficacy of Large Language Models (LLMs) in understanding and extracting scientific knowledge across specific domains and to create a deep learning framework: Knowledge AI. As a part of this framework, we employ pre-trai…
External link:
http://arxiv.org/abs/2408.04651
Author:
Muralidharan, Saurav, Sreenivas, Sharath Turuvekere, Joshi, Raviraj, Chochowski, Marcin, Patwary, Mostofa, Shoeybi, Mohammad, Catanzaro, Bryan, Kautz, Jan, Molchanov, Pavlo
Large language models (LLMs) targeting different deployment scales and sizes are currently produced by training each variant from scratch; this is extremely compute-intensive. In this paper, we investigate if pruning an existing LLM and then re-train…
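The retraining step commonly paired with pruning is distillation from the original (teacher) model into the pruned (student) model. The following is a hedged sketch of a standard temperature-scaled distillation loss; the exact loss form and temperature are assumptions, not necessarily this paper's recipe:

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL divergence KL(teacher || student),
    averaged over the batch and rescaled by T^2 (Hinton-style)."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

teacher = np.array([[2.0, 0.5, -1.0]])
print(distillation_loss(teacher, teacher))               # identical logits -> 0.0
print(distillation_loss(np.zeros((1, 3)), teacher) > 0)  # mismatch -> True
```

Minimizing this loss pulls the student's output distribution toward the teacher's, which is how accuracy lost to pruning is typically recovered without training from scratch.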
External link:
http://arxiv.org/abs/2407.14679
Spintronic-based neuromorphic hardware offers high-density and rapid data processing at nanoscale lengths by leveraging magnetic configurations like skyrmion and domain walls. Here, we present the maximal hardware implementation of a convolutional ne…
External link:
http://arxiv.org/abs/2407.08469
Author:
Bolton, Elliot, Xiong, Betty, Muralidharan, Vijaytha, Schamroth, Joel, Muralidharan, Vivek, Manning, Christopher D., Daneshjou, Roxana
Large language models, such as GPT-4 and Med-PaLM, have shown impressive performance on clinical tasks; however, they require access to compute, are closed-source, and cannot be deployed on device. Mid-size models such as BioGPT-large, BioMedLM, LLaM…
External link:
http://arxiv.org/abs/2404.15894
Author:
Low, Yen Sia, Jackson, Michael L., Hyde, Rebecca J., Brown, Robert E., Sanghavi, Neil M., Baldwin, Julian D., Pike, C. William, Muralidharan, Jananee, Hui, Gavin, Alexander, Natasha, Hassan, Hadeel, Nene, Rahul V., Pike, Morgan, Pokrzywa, Courtney J., Vedak, Shivam, Yan, Adam Paul, Yao, Dong-han, Zipursky, Amy R., Dinh, Christina, Ballentine, Philip, Derieg, Dan C., Polony, Vladimir, Chawdry, Rehan N., Davies, Jordan, Hyde, Brigham B., Shah, Nigam H., Gombar, Saurabh
Evidence to guide healthcare decisions is often limited by a lack of relevant and trustworthy literature as well as difficulty in contextualizing existing research for a specific patient. Large language models (LLMs) could potentially address both ch…
External link:
http://arxiv.org/abs/2407.00541
Author:
Weber, Bent, Fuhrer, Michael S, Sheng, Xian-Lei, Yang, Shengyuan A, Thomale, Ronny, Shamim, Saquib, Molenkamp, Laurens W, Cobden, David, Pesin, Dmytro, Zandvliet, Harold J W, Bampoulis, Pantelis, Claessen, Ralph, Menges, Fabian R, Gooth, Johannes, Felser, Claudia, Shekhar, Chandra, Tadich, Anton, Zhao, Mengting, Edmonds, Mark T, Jia, Junxiang, Bieniek, Maciej, Väyrynen, Jukka I, Culcer, Dimitrie, Muralidharan, Bhaskaran, Nadeem, Muhammad
2D topological insulators promise novel approaches towards electronic, spintronic, and quantum device applications. This is owing to unique features of their electronic band structure, in which the bulk-boundary correspondence enforces the existence of…
External link:
http://arxiv.org/abs/2406.14209
Author:
Cai, Ruisi, Muralidharan, Saurav, Heinrich, Greg, Yin, Hongxu, Wang, Zhangyang, Kautz, Jan, Molchanov, Pavlo
Training modern LLMs is extremely resource intensive, and customizing them for various deployment scenarios characterized by limited compute and memory resources through repeated training is impractical. In this paper, we introduce Flextron, a networ…
External link:
http://arxiv.org/abs/2406.10260