Showing 1 - 10 of 443 results for search: '"Wang, Yihang"'
Generative LLMs have achieved significant success in various industrial tasks and can effectively adapt to vertical domains and downstream tasks through in-context learning (ICL). However, as tasks become increasingly complex, the context length required by ICL is also …
External link:
http://arxiv.org/abs/2408.10497
Although instruction tuning is widely used to adjust behavior in Large Language Models (LLMs), extensive empirical evidence and research indicate that it is primarily a process in which the model fits specific task formats, rather than acquiring new …
External link:
http://arxiv.org/abs/2408.10841
In-context learning (ICL) capabilities are foundational to the success of large language models (LLMs). Recently, context compression has attracted growing interest since it can largely reduce the reasoning complexity and computation costs of LLMs. In …
External link:
http://arxiv.org/abs/2408.00274
Author:
Wu, Yizhang, Li, Yuan, Liu, Yihan, Zhu, Dashuai, Xing, Sicheng, Lambert, Noah, Weisbecker, Hannah, Liu, Siyuan, Davis, Brayden, Zhang, Lin, Wang, Meixiang, Yuan, Gongkai, You, Chris Zhoufan, Zhang, Anran, Duncan, Cate, Xie, Wanrong, Wang, Yihang, Wang, Yong, Kanamurlapudi, Sreya, Evert, Garcia-Guzman, Putcha, Arjun, Dickey, Michael D., Huang, Ke, Bai, Wubin
Bioelectronic implants with soft mechanics, biocompatibility, and excellent electrical performance enable biomedical implants to record electrophysiological signals and execute interventions within internal organs, promising to revolutionize the diag…
External link:
http://arxiv.org/abs/2406.13956
Author:
Wang, Yihang, Qiu, Yuying, Chen, Peng, Zhao, Kai, Shu, Yang, Rao, Zhongwen, Pan, Lujia, Yang, Bin, Guo, Chenjuan
With the increasing collection of time series data from various domains, there arises a strong demand for general time series forecasting models pre-trained on a large number of time-series datasets to support a variety of downstream prediction tasks …
External link:
http://arxiv.org/abs/2405.17478
Large Language Models (LLMs) have excelled in various tasks but perform better in high-resource scenarios, which presents challenges in low-resource settings. Data scarcity and the inherent difficulty of adapting LLMs to specific tasks compound the …
External link:
http://arxiv.org/abs/2404.00914
Author:
Chen, Peng, Zhang, Yingying, Cheng, Yunyao, Shu, Yang, Wang, Yihang, Wen, Qingsong, Yang, Bin, Guo, Chenjuan
Transformers for time series forecasting mainly model time series at limited or fixed scales, making it challenging to capture different characteristics spanning various scales. We propose Pathformer, a multi-scale Transformer with adaptive pathway …
External link:
http://arxiv.org/abs/2402.05956
The maneuverability and drivability of a teleoperated ground vehicle can be seriously degraded by large communication delays if the delays are not properly compensated. This paper proposes a predicted trajectory guidance control (PTGC) framework to c…
External link:
http://arxiv.org/abs/2212.02706
While representation learning has been central to the rise of machine learning and artificial intelligence, a key problem remains in making the learned representations meaningful. For this, the typical approach is to regularize the learned representa…
External link:
http://arxiv.org/abs/2209.00905
Published in:
Bioresource Technology, October 2024, vol. 409