Showing 1 - 4 of 4 for search: '"Long Jikai"'
Author:
Guo, Wentao, Long, Jikai, Zeng, Yimeng, Liu, Zirui, Yang, Xinyu, Ran, Yide, Gardner, Jacob R., Bastani, Osbert, De Sa, Christopher, Yu, Xiaodong, Chen, Beidi, Xu, Zhaozhuo
Zeroth-order optimization (ZO) is a memory-efficient strategy for fine-tuning Large Language Models using only forward passes. However, the application of ZO fine-tuning in memory-constrained settings such as mobile phones and laptops is still challenging … (a hedged sketch of the forward-pass-only idea follows this record)
External link:
http://arxiv.org/abs/2406.02913
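For illustration only: a minimal sketch of the two-forward-pass (SPSA-style) gradient estimate that zeroth-order fine-tuning relies on, assuming a generic PyTorch model and a user-supplied loss_fn. This shows the general technique, not the specific algorithm of the paper above.

import torch

def zo_sgd_step(model, loss_fn, batch, eps=1e-3, lr=1e-6):
    """One SPSA-style zeroth-order step: two forward passes, no backward pass."""
    params = [p for p in model.parameters() if p.requires_grad]
    with torch.no_grad():
        z = [torch.randn_like(p) for p in params]   # random perturbation direction
        for p, zi in zip(params, z):                # evaluate at theta + eps*z
            p.add_(eps * zi)
        loss_plus = loss_fn(model, batch)
        for p, zi in zip(params, z):                # evaluate at theta - eps*z
            p.sub_(2 * eps * zi)
        loss_minus = loss_fn(model, batch)
        for p, zi in zip(params, z):                # restore original theta
            p.add_(eps * zi)
        g = (loss_plus - loss_minus) / (2 * eps)    # projected gradient estimate
        for p, zi in zip(params, z):                # SGD-like update along z
            p.sub_(lr * g * zi)
    return float(loss_plus), float(loss_minus)

Because only forward passes are used, no activations or optimizer gradients need to be stored, which is the memory saving the abstract refers to.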
Given a Large Language Model (LLM) generation, how can we identify which training data led to this generation? In this paper, we propose RapidIn, a scalable framework adapting to LLMs for estimating the influence of each training data. The proposed … (a generic, hedged influence sketch follows this record)
External link:
http://arxiv.org/abs/2405.11724
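For illustration only: a generic gradient-similarity influence score in PyTorch (a textbook TracIn-style heuristic). This is not RapidIn's actual, scalable method; model, loss_fn, train_example, and test_generation are assumed placeholders.

import torch

def gradient_influence(model, loss_fn, train_example, test_generation):
    """Score how much a training example aligns with a test generation,
    using the dot product of their per-example loss gradients."""
    def flat_grad(example):
        model.zero_grad()
        loss = loss_fn(model, example)
        loss.backward()
        return torch.cat([p.grad.reshape(-1) for p in model.parameters()
                          if p.grad is not None])
    g_train = flat_grad(train_example)
    g_test = flat_grad(test_generation)
    return torch.dot(g_train, g_test).item()  # larger = more influential (heuristically)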
Author:
Sun Shitong, Heng Jiaming, Pang Quanrui, Li Sheng, Pan Wenwen, Hu Ye, Song Wanying, Long Jikai
Published in:
IOP Conference Series: Materials Science and Engineering. 423:012100
Author:
Long Jikai, Heng Jiaming, Hu Ye, Pang Quanrui, Li Sheng, Sun Shitong, Pan Wenwen, Song Wanying
Published in:
IOP Conference Series: Materials Science & Engineering; Nov2018, Vol. 423 Issue 1, p1-1, 1p