Showing 1 - 10 of 576 for search: '"CHEN Zhengyu"'
Published in:
Open Geosciences, Vol. 15, Iss. 1, pp. 35-68 (2023)
This work presents an in-depth examination of the Carboniferous volcanic reservoir within the CH471 well area, situated in the central portion of the Hongche fault zone on the northwestern margin of the Junggar Basin. Leveraging seismic data and well …
External link:
https://doaj.org/article/1a95451d1c1248ad8873261567fc9ddc
Accurate remaining useful life (RUL) predictions are critical to the safe operation of aero-engines. Currently, the RUL prediction task is mainly a regression paradigm with only mean square error as the loss function and lacks research on feature spa…
External link:
http://arxiv.org/abs/2411.00461
Author:
He, Bin, Ying, Yuzhe, Shi, Yejiong, Meng, Zhe, Yin, Zichen, Chen, Zhengyu, Hu, Zhangwei, Xue, Ruizhi, Jing, Linkai, Lu, Yang, Sun, Zhenxing, Man, Weitao, Wu, Youtu, Lei, Dan, Zhang, Ning, Wang, Guihuai, Xue, Ping
Current surgical procedures for spinal cord tumors lack in vivo high-resolution, high-speed multifunctional imaging systems, posing challenges for precise tumor resection and intraoperative decision-making. This study introduces the Fast Adaptive Foc…
External link:
http://arxiv.org/abs/2410.21809
Author:
Ai, Yuang, Zhou, Xiaoqiang, Huang, Huaibo, Han, Xiaotian, Chen, Zhengyu, You, Quanzeng, Yang, Hongxia
Image restoration (IR) in real-world scenarios presents significant challenges due to the lack of high-capacity models and comprehensive datasets. To tackle these issues, we present a dual strategy: GenIR, an innovative data curation pipeline, and Dr…
External link:
http://arxiv.org/abs/2410.18666
Model merging has gained significant attention as a cost-effective approach to integrate multiple single-task fine-tuned models into a unified one that can perform well on multiple tasks. However, existing model merging techniques primarily focus on …
External link:
http://arxiv.org/abs/2410.13910
Textual Attributed Graphs (TAGs) are crucial for modeling complex real-world systems, yet leveraging large language models (LLMs) for TAGs presents unique challenges due to the gap between sequential text processing and graph-structured data. We intr…
External link:
http://arxiv.org/abs/2410.07074
The scaling of large language models (LLMs) is a critical research area for the efficiency and effectiveness of model training and deployment. Our work investigates the transferability and discrepancies of scaling laws between Dense Models and Mixtur…
External link:
http://arxiv.org/abs/2410.05661
Author:
Xie, Qianqian, Li, Dong, Xiao, Mengxi, Jiang, Zihao, Xiang, Ruoyu, Zhang, Xiao, Chen, Zhengyu, He, Yueru, Han, Weiguang, Yang, Yuzhe, Chen, Shunian, Zhang, Yifei, Shen, Lihang, Kim, Daniel, Liu, Zhiwei, Luo, Zheheng, Yu, Yangyang, Cao, Yupeng, Deng, Zhiyang, Yao, Zhiyuan, Li, Haohang, Feng, Duanyu, Dai, Yongfu, Somasundaram, VijayaSai, Lu, Peng, Zhao, Yilun, Long, Yitao, Xiong, Guojun, Smith, Kaleb, Yu, Honghai, Lai, Yanzhao, Peng, Min, Nie, Jianyun, Suchow, Jordan W., Liu, Xiao-Yang, Wang, Benyou, Lopez-Lira, Alejandro, Huang, Jimin, Ananiadou, Sophia
Large language models (LLMs) have advanced financial applications, yet they often lack sufficient financial knowledge and struggle with tasks involving multi-modal inputs like tables and time series data. To address these limitations, we introduce …
External link:
http://arxiv.org/abs/2408.11878
Heterophilic Graph Neural Networks (HGNNs) have shown promising results for semi-supervised learning tasks on graphs. Notably, most real-world heterophilic graphs are composed of a mixture of nodes with different neighbor patterns, exhibiting local n…
External link:
http://arxiv.org/abs/2408.09490
Author:
Wang, Yuxin, Feng, Duanyu, Dai, Yongfu, Chen, Zhengyu, Huang, Jimin, Ananiadou, Sophia, Xie, Qianqian, Wang, Hao
Data serves as the fundamental foundation for advancing deep learning, particularly tabular data presented in a structured format, which is highly conducive to modeling. However, even in the era of LLMs, obtaining tabular data from sensitive domains r…
External link:
http://arxiv.org/abs/2408.02927