Showing 1 - 10 of 121 for search: '"Ji, Yixin"'
Author:
Ji, Yixin, Xiang, Yang, Li, Juntao, Xia, Qingrong, Li, Ping, Duan, Xinyu, Wang, Zhefeng, Zhang, Min
As large language models (LLMs) are widely applied across various fields, model compression has become increasingly crucial for reducing costs and improving inference efficiency. Post-training pruning is a promising method that does not require…
External link:
http://arxiv.org/abs/2410.17711
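The abstract above concerns post-training pruning of LLMs. As a minimal sketch of the general idea only (one-shot magnitude pruning in PyTorch, a common baseline, not the method of arXiv:2410.17711; the toy model is invented for illustration):

import torch
import torch.nn as nn

def magnitude_prune_(linear: nn.Linear, sparsity: float = 0.5) -> None:
    # Zero out the `sparsity` fraction of smallest-magnitude weights in place;
    # no retraining is performed, matching the post-training setting.
    w = linear.weight.data
    k = int(w.numel() * sparsity)
    if k == 0:
        return
    threshold = w.abs().flatten().kthvalue(k).values
    w.mul_(w.abs() > threshold)

# Hypothetical toy model; real use would target an LLM's linear layers.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
for module in model.modules():
    if isinstance(module, nn.Linear):
        magnitude_prune_(module, sparsity=0.5)

Stronger post-training criteria (e.g., scoring weight importance with calibration data) follow the same replace-weights-in-place pattern.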
Author:
Chen, Kang, Zhang, Qingheng, Lian, Chengbao, Ji, Yixin, Liu, Xuwei, Han, Shuguang, Wu, Guoqiang, Huang, Fei, Chen, Jufeng
Unlike professional Business-to-Consumer (B2C) e-commerce platforms (e.g., Amazon), Consumer-to-Consumer (C2C) platforms (e.g., Facebook Marketplace) mainly target individual sellers, who usually lack sufficient experience in e-commerce…
External link:
http://arxiv.org/abs/2410.16977
Large Language Models (LLMs) have demonstrated an impressive capability known as In-context Learning (ICL), which enables them to acquire knowledge from textual demonstrations without the need for parameter updates. However, many studies have highlighted…
External link:
http://arxiv.org/abs/2406.01224
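Since this abstract describes the ICL mechanism (learning from demonstrations without parameter updates), here is a minimal illustrative sketch; the sentiment task, demonstrations, and query are invented for illustration:

# Build a few-shot prompt: labeled demonstrations followed by the query.
# A frozen LLM completes the final label; no gradients or parameter updates.
demonstrations = [
    ("The movie was wonderful.", "positive"),
    ("A dull, lifeless script.", "negative"),
]
query = "An unexpectedly moving finale."

prompt = "".join(f"Review: {x}\nSentiment: {y}\n\n" for x, y in demonstrations)
prompt += f"Review: {query}\nSentiment:"
print(prompt)  # pass this string to any LLM's completion API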
In recent years, large language models (LLMs) have driven advances in natural language processing. Still, their growing scale has increased the computational burden, necessitating a balance between efficiency and performance. Low-rank compression…
External link:
http://arxiv.org/abs/2405.10616
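As a minimal sketch of low-rank compression (truncated SVD of a weight matrix into two thin linear layers, in PyTorch); per-layer rank selection, the hard part, is not shown, and the rank below is arbitrary:

import torch
import torch.nn as nn

def low_rank_factorize(linear: nn.Linear, rank: int) -> nn.Sequential:
    # Replace W (out x in) with B @ A, where A is rank x in and B is out x rank.
    U, S, Vh = torch.linalg.svd(linear.weight.data, full_matrices=False)
    A = nn.Linear(linear.in_features, rank, bias=False)
    B = nn.Linear(rank, linear.out_features, bias=linear.bias is not None)
    A.weight.data = S[:rank].sqrt().unsqueeze(1) * Vh[:rank]  # rank x in
    B.weight.data = U[:, :rank] * S[:rank].sqrt()             # out x rank
    if linear.bias is not None:
        B.bias.data = linear.bias.data.clone()
    return nn.Sequential(A, B)

layer = nn.Linear(64, 64)
compressed = low_rank_factorize(layer, rank=8)

A Linear(64, 64) holds 4096 weights; the factorized pair holds 2 * 64 * 8 = 1024, a 4x reduction at rank 8, at the cost of approximation error in the discarded singular values.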
Author:
Qiao, Dan, Su, Yi, Wang, Pinzheng, Ye, Jing, Xie, Wenjing, Zhou, Yuechi, Ding, Yuyang, Tang, Zecheng, Wang, Jikai, Ji, Yixin, Wang, Yue, Guo, Pei, Sun, Zechen, Zhang, Zikang, Li, Juntao, Chao, Pingfu, Chen, Wenliang, Fu, Guohong, Zhou, Guodong, Zhu, Qiaoming, Zhang, Min
Large Language Models (LLMs) have played an important role in many fields due to their powerful capabilities. However, their massive number of parameters leads to high deployment requirements and incurs significant inference costs, which impedes their…
External link:
http://arxiv.org/abs/2405.05957
Currently, pre-trained language models (PLMs) do not cope well with the distribution shift problem, resulting in models trained on the training set failing in real test scenarios. To address this problem, test-time adaptation (TTA) shows great potential…
External link:
http://arxiv.org/abs/2304.12764
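For context on what TTA does at inference time, a minimal sketch of one common baseline, Tent-style entropy minimization (not necessarily the method of arXiv:2304.12764); the toy model and batch are invented for illustration:

import torch

def tta_step(model, x, optimizer):
    # Adapt on an unlabeled test batch by minimizing prediction entropy.
    # (Tent restricts updates to normalization parameters; simplified here.)
    logits = model(x)
    log_probs = logits.log_softmax(dim=-1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return logits.detach()

# Toy classifier and a fake unlabeled test batch, for illustration only.
model = torch.nn.Linear(8, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
tta_step(model, torch.randn(4, 8), optimizer)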
Published in:
In Journal of Environmental Chemical Engineering October 2024 12(5)
Author:
Li, Chunhua, Bao, Luqian, Ji, Yixin, Tian, Zhehang, Cui, Mengyao, Shi, Yubo, Zhao, Zhilei, Wang, Xianyou
Published in:
In Coordination Chemistry Reviews 1 September 2024 514
Author:
Tian, Zhehang, Zhang, Xieyang, Zhang, Yuting, Wu, Zimeng, Luan, Guanqun, Bao, Luqian, Ji, Yixin, Cui, Mengyao, Li, Chunhua
Published in:
In Food Chemistry 1 December 2024 460 Part 3
Author:
Bao, Luqian, Tian, Zhehang, Hu, Xiaoyu, Li, Mai, Ji, Yixin, Cui, Mengyao, Wang, Xianyou, Li, Chunhua
Published in:
In Journal of Water Process Engineering August 2024 65