Showing 1 - 10 of 10,219 results for the search: '"WEI, Ying"'
Model editing aims to data-efficiently correct predictive errors of large pre-trained models while ensuring generalization to neighboring failures and locality to minimize unintended effects on unrelated examples. While significant progress has been made …
External link:
http://arxiv.org/abs/2411.01948
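For context, the three desiderata named in this abstract (correcting the target error, generalizing to neighboring failures, and preserving behavior on unrelated examples) can be illustrated with a small evaluation sketch. Everything below — the `evaluate_edit` helper, the toy pre-/post-edit models, and the example sets — is a hypothetical illustration of the general criteria, not the paper's own method or benchmark.

```python
# Illustrative sketch of the three model-editing criteria: efficacy on the
# corrected example, generalization to neighboring (paraphrased) failures,
# and locality on unrelated examples. All names here are placeholders.
from typing import Callable, Dict, List, Tuple

Example = Tuple[str, str]  # (input text, expected output)

def accuracy(predict: Callable[[str], str], examples: List[Example]) -> float:
    """Fraction of examples whose prediction matches the expected output."""
    if not examples:
        return 0.0
    return sum(predict(x) == y for x, y in examples) / len(examples)

def evaluate_edit(
    pre_edit: Callable[[str], str],
    post_edit: Callable[[str], str],
    edit_example: Example,
    neighbors: List[Example],
    unrelated: List[str],
) -> Dict[str, float]:
    """Score an edit on efficacy, generalization, and locality."""
    efficacy = accuracy(post_edit, [edit_example])    # the corrected error itself
    generalization = accuracy(post_edit, neighbors)   # paraphrases / neighboring failures
    # Locality: unrelated inputs should keep their pre-edit predictions.
    locality = sum(post_edit(x) == pre_edit(x) for x in unrelated) / max(len(unrelated), 1)
    return {"efficacy": efficacy, "generalization": generalization, "locality": locality}

if __name__ == "__main__":
    # Toy stand-ins for a pre-edit and post-edit model.
    pre = lambda x: "Paris" if "France" in x else "unknown"
    post = lambda x: "Berlin" if "Germany" in x else pre(x)
    print(evaluate_edit(
        pre, post,
        edit_example=("Capital of Germany?", "Berlin"),
        neighbors=[("What is the capital city of Germany?", "Berlin")],
        unrelated=["Capital of France?"],
    ))
```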
Low-Rank Adaptation (LoRA) is a parameter-efficient technique for rapidly fine-tuning foundation models. In standard LoRA training dynamics, models tend to quickly converge to a local optimum near the initialization. However, this local optimum may not …
External link:
http://arxiv.org/abs/2410.22911
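As background, a minimal sketch of the LoRA parameterization this abstract refers to: the pre-trained weight is frozen and augmented with a low-rank update B·A, where B is initialized to zero so training starts exactly at the pre-trained model, i.e. "near the initialization". The rank, scaling, and layer sizes below are illustrative assumptions, not taken from the paper.

```python
# Minimal LoRA-style linear layer: frozen base weight plus a trainable
# low-rank correction scaled by alpha / r. Zero-initializing B means the
# layer's output is unchanged at step 0.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pre-trained weights
            p.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(nn.Linear(64, 64))
print(layer(torch.randn(2, 64)).shape)  # torch.Size([2, 64])
```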
Author:
Hua, Muchuan, Chen, Wei-Ying, Hou, Hanyu, Kolluru, Venkata Surya Chaitanya, Chan, Maria K. Y., Liu, HaiHua, Gage, Thomas E., Zuo, Jian-Min, Diroll, Benjamin T., Wen, Jianguo
Deterministic creation of quantum emitters with high single-photon purity and excellent indistinguishability is essential for practical applications in quantum information science. Many successful attempts have been carried out in hexagonal boron nitride …
External link:
http://arxiv.org/abs/2410.13169
Author:
Wang, Zhaoyang, He, Weilei, Liang, Zhiyuan, Zhang, Xuchao, Bansal, Chetan, Wei, Ying, Zhang, Weitong, Yao, Huaxiu
Recent self-rewarding large language models (LLMs) have successfully applied LLM-as-a-Judge to iteratively improve alignment performance without needing human annotations for preference data. These methods commonly utilize the same LLM to act …
External link:
http://arxiv.org/abs/2410.12735
Molecular generation and molecular property prediction are both crucial for drug discovery, but they are often developed independently. Inspired by recent studies, which demonstrate that diffusion models, a prominent generative approach, can learn meaningful …
External link:
http://arxiv.org/abs/2410.10516
Large Language Models (LLMs) have recently revolutionized the NLP field, yet they still fall short on some specific downstream tasks. In this work, we focus on utilizing LLMs to perform machine translation, where we observe that two patterns of errors …
External link:
http://arxiv.org/abs/2410.07054
Author:
Wang, Jian, Wei, Ying, Mao, Baohong, Xu, Wenjing, Liu, Xiyun, Tao, Chengbing, Lu, Yongli, Sheng, Yannan, Liu, Qing
Published in:
African Journal of Reproductive Health / La Revue Africaine de la Santé Reproductive, 2024 Sep 01. 28(9), 180-190.
External link:
https://www.jstor.org/stable/27332752
Author:
Ni, Yuyan, Feng, Shikun, Hong, Xin, Sun, Yuancheng, Ma, Wei-Ying, Ma, Zhi-Ming, Ye, Qiwei, Lan, Yanyan
Deep learning methods have been considered promising for accelerating molecular screening in drug discovery and material design. Due to the limited availability of labelled data, various self-supervised molecular pre-training methods have been presented …
External link:
http://arxiv.org/abs/2407.11086
Author:
Jiang, Gangwei, Jiang, Caigao, Li, Zhaoyi, Xue, Siqiao, Zhou, Jun, Song, Linqi, Lian, Defu, Wei, Ying
Fine-tuning large language models (LLMs) can cause them to lose their general capabilities. However, the intrinsic mechanisms behind such forgetting remain unexplored. In this paper, we begin by examining this phenomenon, focusing on knowledge …
External link:
http://arxiv.org/abs/2406.12227
Author:
Gao, Bowen, Tan, Haichuan, Huang, Yanwen, Ren, Minsi, Huang, Xiao, Ma, Wei-Ying, Zhang, Ya-Qin, Lan, Yanyan
Recent advancements in structure-based drug design (SBDD) have significantly enhanced the efficiency and precision of drug discovery by generating molecules tailored to bind specific protein pockets. Despite these technological strides, their practical …
External link:
http://arxiv.org/abs/2406.08980