Showing 1 - 10 of 219
for search: '"Shi, Yiming"'
The rapid growth of model scale has necessitated substantial computational resources for fine-tuning. Existing approaches such as Low-Rank Adaptation (LoRA) seek to address the problem of the large number of updated parameters in full fine-tuning.
External link:
http://arxiv.org/abs/2410.13618
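The LoRA idea mentioned in the abstract above can be sketched in a few lines: instead of training a full weight update, one trains two small low-rank factors whose product approximates it. The shapes, rank, and variable names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Minimal sketch of Low-Rank Adaptation (LoRA), assuming a single
# d_out x d_in linear layer. Instead of updating all d_out * d_in
# weights, train factors B (d_out x r) and A (r x d_in), r << d.
d_out, d_in, r = 512, 512, 8

W = np.random.randn(d_out, d_in)      # frozen pre-trained weight
A = np.random.randn(r, d_in) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))              # zero init, so the update starts at 0

delta = B @ A                         # low-rank update, rank at most r
W_adapted = W + delta                 # effective weight at inference

full_params = d_out * d_in            # 262144 trainable weights
lora_params = d_out * r + r * d_in    # 8192 trainable weights
print(lora_params, full_params)
```

With rank 8 the trainable-parameter count drops from 262144 to 8192, which is the memory saving the abstract alludes to; the zero initialization of `B` keeps the adapted model identical to the pre-trained one before training begins.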
We introduce the FLORES+ dataset as an evaluation benchmark for modern Wu Chinese machine translation models and showcase its compatibility with existing Wu data. Wu Chinese is mutually unintelligible with other Sinitic languages such as Mandarin and Y
External link:
http://arxiv.org/abs/2410.10278
Author:
Sun, Chengwei, Wei, Jiwei, Wu, Yujia, Shi, Yiming, He, Shiyuan, Ma, Zeyu, Xie, Ning, Yang, Yang
Large pre-trained models (LPMs) have demonstrated exceptional performance in diverse natural language processing and computer vision tasks. However, fully fine-tuning these models poses substantial memory challenges, particularly in resource-constrai
External link:
http://arxiv.org/abs/2409.05926
Neural language representation models such as GPT, pre-trained on large-scale corpora, can effectively capture rich semantic patterns from plain text and be fine-tuned to consistently improve natural language generation performance. However, existing
External link:
http://arxiv.org/abs/2408.10130
Personalized text-to-image generation has gained significant attention for its capability to generate high-fidelity portraits of specific identities conditioned on user-defined prompts. Existing methods typically involve test-time fine-tuning or inst
External link:
http://arxiv.org/abs/2408.06740
Unsupervised fault detection in multivariate time series is critical for maintaining the integrity and efficiency of complex systems, with current methodologies largely focusing on statistical and machine learning techniques. However, these approache
External link:
http://arxiv.org/abs/2405.16258
Author:
Jia, Junlong, Hu, Ying, Weng, Xi, Shi, Yiming, Li, Miao, Zhang, Xingjian, Zhou, Baichuan, Liu, Ziyu, Luo, Jie, Huang, Lei, Wu, Ji
We present TinyLLaVA Factory, an open-source modular codebase for small-scale large multimodal models (LMMs) with a focus on simplicity of code implementations, extensibility of new features, and reproducibility of training results. Following the des
External link:
http://arxiv.org/abs/2405.11788
The objective of this design project is to use inkjet printers with conductive ink to print functional antennas. The project is sponsored by the University of Arizona Electrical and Computer Engineering department. The project was completed with a De
External link:
http://hdl.handle.net/10150/624234
http://arizona.openrepository.com/arizona/handle/10150/624234
Author:
Shi, Yiming, Shi, Haochen, Wang, Haichang, Chen, Chun-Jung, Li, Yaoyao, Qiao, Bo, Liang, Zhiqin, Zhao, Suling, Hang, Deyu, Xu, Zheng, Song, Dandan
Published in:
Chemical Engineering Journal, vol. 500, 15 November 2024
Author:
Zeng, Ziwei, Shi, Yiming, Cai, Yonghua, Yang, Xin, Zheng, Xiaobin, Huang, Liang, Liang, Zhenxing, Liu, Zhanzhen, Luo, Shuangling, Xiong, Li, Li, Shujuan, Liu, Zhihang, Kang, Liang, Liu, Huashan, Li, Wenxin
Published in:
Experimental Cell Research, vol. 443, no. 1, 1 November 2024