Showing 1 - 10 of 2,286 results for search: '"LIU Shiwei"'
Published in:
Shipin Kexue, Vol 44, Iss 20, Pp 252-259 (2023)
Ginseng extract was fermented by the probiotic Lactobacillus plantarum to investigate the changes in its active ingredients and antioxidant activities before and after fermentation. The contents of total phenols, polysaccharides, total flavonoids and …
External link:
https://doaj.org/article/c4e77be41ae043c3956cb8118a9ebd30
Published in:
Xiehe Yixue Zazhi, Vol 14, Iss 3, Pp 543-552 (2023)
Objective To investigate the mechanism by which Vaspin improves pancreatic beta cell function in type 2 diabetic (T2DM) rats. Methods The diabetic rat model was established by feeding a high-fat, high-sugar diet combined with intraperitoneal injection …
External link:
https://doaj.org/article/5e880b720fe244a59882cf52233ed6fd
Published in:
Xiehe Yixue Zazhi, Vol 12, Iss 6, Pp 1009-1015 (2021)
Acral malignant melanoma, the predominant type of melanoma in China, commonly presents as plantar lesions. Owing to its similar clinical manifestation, it is easily misdiagnosed as a diabetic foot ulcer. We report a case of diabetes mellitus complicated with …
External link:
https://doaj.org/article/afef9bc2969b4eb89d3f163994a62d92
Recent work on pruning large language models (LLMs) has shown that one can eliminate a large number of parameters without compromising performance, making pruning a promising strategy to reduce LLM model size. Existing LLM pruning strategies typically …
External link:
http://arxiv.org/abs/2410.10912
This paper investigates the under-explored area of low-rank weight training for large-scale Conformer-based speech recognition models from scratch. Our study demonstrates the viability of this training paradigm for such models, yielding several notable …
External link:
http://arxiv.org/abs/2410.07771
Author:
Bandari, Abhinav; Yin, Lu; Hsieh, Cheng-Yu; Jaiswal, Ajay Kumar; Chen, Tianlong; Shen, Li; Krishna, Ranjay; Liu, Shiwei
Network pruning has emerged as a potential solution to make LLMs cheaper to deploy. However, existing LLM pruning approaches universally rely on the C4 dataset as the calibration data for calculating pruning scores, leaving its optimality unexplored.
External link:
http://arxiv.org/abs/2410.07461
Published in:
Green Processing and Synthesis, Vol 10, Iss 1, Pp 189-200 (2021)
Thermal analysis was used in this research for a comparative study of the combustion and gasification characteristics of semi-coke prepared under pyrolytic atmospheres rich in CH4 and H2 at different proportions. Distinctions among the different semi-coke …
External link:
https://doaj.org/article/33f02813014040d4ac76492fa5647622
Author:
Huang, Tianjin; Meng, Fang; Shen, Li; Liu, Fan; Pei, Yulong; Pechenizkiy, Mykola; Liu, Shiwei; Chen, Tianlong
Large-scale neural networks have demonstrated remarkable performance in different domains like vision and language processing, although at the cost of massive computation resources. As illustrated by the compression literature, structural model pruning is …
External link:
http://arxiv.org/abs/2407.17412
Author:
Jaiswal, Ajay; Yin, Lu; Zhang, Zhenyu; Liu, Shiwei; Zhao, Jiawei; Tian, Yuandong; Wang, Zhangyang
Modern Large Language Models (LLMs) are composed of matrices with billions of elements, making their storage and processing quite demanding in terms of computational resources and memory usage. Being significantly large, such matrices can often be expressed …
External link:
http://arxiv.org/abs/2407.11239
Author:
Zhang, Zhenyu; Jaiswal, Ajay; Yin, Lu; Liu, Shiwei; Zhao, Jiawei; Tian, Yuandong; Wang, Zhangyang
Training Large Language Models (LLMs) is memory-intensive due to the large number of parameters and associated optimization states. GaLore, a recent method, reduces memory usage by projecting weight gradients into a low-rank subspace without compromising …
External link:
http://arxiv.org/abs/2407.08296