Showing 1 - 10 of 610 for the search: '"Wang Zhenyi"'
Author:
Rong Xinqi, Wu Mingsheng, Xin Xuezhi, Zhang Bo, Liu Dianwen, Xiao Huirong, Wang Zhenyi, Cui Junhui, Wang Jianping, Wang Zhongcheng, Fan Xiaohua, Hu Ying, Rong Yisheng, Li Ying
Published in:
Digital Chinese Medicine, Vol 6, Iss 4, Pp 467-476 (2023)
Objective: To compare the efficacy and safety of combining diosmin with Jiuhua hemorrhoid suppository versus diosmin alone for the treatment of hemorrhoid hemorrhage. Method: The Jiuhua hemorrhoid suppository study was conducted in 10 medical centers …
External link:
https://doaj.org/article/33e983e621a84840816e3f010fd91982
Author:
Yang, Enneng, Shen, Li, Wang, Zhenyi, Guo, Guibing, Wang, Xingwei, Cao, Xiaocun, Zhang, Jie, Tao, Dacheng
Model merging-based multitask learning (MTL) offers a promising approach for performing MTL by merging multiple expert models without requiring access to raw training data. However, in this paper, we examine the merged model's representation distribution …
External link:
http://arxiv.org/abs/2410.14389
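The abstract above is cut off before the paper's method, so as a rough illustration of what "merging multiple expert models" means in practice, here is a minimal PyTorch sketch of the simplest merging scheme: uniform weight averaging. It assumes all experts share one architecture; the function name merge_experts and the uniform weighting are illustrative choices for this sketch, not the paper's algorithm.

import copy
import torch

def merge_experts(expert_models):
    # Naive uniform merge: average corresponding parameters across
    # fine-tuned experts that share a single architecture.
    merged = copy.deepcopy(expert_models[0])
    merged_state = merged.state_dict()
    expert_states = [m.state_dict() for m in expert_models]
    for name in merged_state:
        # Stack the corresponding tensors and take their elementwise mean.
        merged_state[name] = torch.stack(
            [s[name].float() for s in expert_states]
        ).mean(dim=0)
    merged.load_state_dict(merged_state)
    return merged

Uniform averaging is only the baseline; most merging work (including the task arithmetic sketched at the end of this listing) replaces the equal weights with learned or tuned coefficients.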
Large language models (LLMs) have recently demonstrated state-of-the-art performance across various natural language processing (NLP) tasks, achieving near-human levels in multiple language understanding challenges and aligning closely with the core …
External link:
http://arxiv.org/abs/2407.14112
The large-scale integration of intermittent renewable energy resources introduces increased uncertainty and volatility to the supply side of power systems, thereby complicating system operation and control. Recently, data-driven approaches, particularly …
External link:
http://arxiv.org/abs/2407.00681
Data-Free Meta-Learning (DFML) aims to derive knowledge from a collection of pre-trained models without accessing their original data, enabling the rapid adaptation to new unseen tasks. Current methods often overlook the heterogeneity among pre-trained models …
External link:
http://arxiv.org/abs/2405.16560
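As a loose illustration of the data-free setting these DFML abstracts describe, the following PyTorch sketch distills a pre-trained teacher into a student using random pseudo-inputs in place of the inaccessible training data. It is a deliberately crude baseline under that assumption (actual DFML methods synthesize far more informative inputs); distill_without_data and all of its parameters are hypothetical names for this sketch only.

import torch
import torch.nn.functional as F

def distill_without_data(teacher, student, optimizer, input_shape, steps=100):
    # Crude data-free distillation: query the teacher on random
    # pseudo-inputs and train the student to match its output
    # distribution via KL divergence. The optimizer must be built
    # over student.parameters() by the caller.
    teacher.eval()
    for _ in range(steps):
        x = torch.randn(32, *input_shape)  # random pseudo-data
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)
        loss = F.kl_div(
            F.log_softmax(s_logits, dim=1),
            F.softmax(t_logits, dim=1),
            reduction="batchmean",
        )
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return student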
Data-Free Meta-Learning (DFML) aims to extract knowledge from a collection of pre-trained models without requiring the original data, presenting practical benefits in contexts constrained by data privacy concerns. Current DFML methods primarily focus …
External link:
http://arxiv.org/abs/2405.00984
Continual Learning (CL) focuses on learning from dynamic and changing data distributions while retaining previously acquired knowledge. Various methods have been developed to address the challenge of catastrophic forgetting, including regularization-based …
External link:
http://arxiv.org/abs/2403.13249
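The truncated sentence above points at regularization-based continual learning. As a hedged sketch of that family's core idea, the helper below adds a quadratic penalty pulling the current weights toward those learned on earlier tasks; EWC-style methods additionally weight each term by an estimated parameter importance, which this uniform version omits. The name reg_penalty is an illustrative assumption, not from the paper.

import torch

def reg_penalty(model, old_params, lam=1.0):
    # Quadratic penalty toward the weights learned on previous tasks.
    # old_params should hold detached clones of the earlier parameters,
    # e.g. {n: p.detach().clone() for n, p in model.named_parameters()}.
    loss = torch.tensor(0.0)
    for name, p in model.named_parameters():
        loss = loss + ((p - old_params[name]) ** 2).sum()
    return lam * loss

In training, this term is simply added to the new task's loss, so gradient descent trades off new-task fit against drift from old-task solutions.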
Few-Shot Class-Incremental Learning (FSCIL) models aim to incrementally learn new classes with scarce samples while preserving knowledge of old ones. Existing FSCIL methods usually fine-tune the entire backbone, leading to overfitting and hindering the …
External link:
http://arxiv.org/abs/2403.09857
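To make concrete the alternative to fine-tuning the entire backbone that this FSCIL abstract criticizes, here is a minimal sketch that freezes everything except the classifier head, a common baseline for limiting overfitting on scarce new-class samples. The helper freeze_backbone and its head_name parameter are illustrative assumptions, not the paper's method.

import torch.nn as nn

def freeze_backbone(model: nn.Module, head_name: str = "fc"):
    # Disable gradients for all parameters except the classifier head,
    # so incremental sessions update only the head (e.g. "fc.weight").
    for name, p in model.named_parameters():
        p.requires_grad = name.startswith(head_name)
    return model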
Author:
Yang, Enneng, Shen, Li, Wang, Zhenyi, Guo, Guibing, Chen, Xiaojun, Wang, Xingwei, Tao, Dacheng
Multi-task learning (MTL) compresses the information from multiple tasks into a unified backbone to improve computational efficiency and generalization. Recent work directly merges multiple independently trained models to perform MTL instead of collecting …
External link:
http://arxiv.org/abs/2402.02705
Multi-task learning (MTL) aims to empower a model to tackle multiple tasks simultaneously. A recent development known as task arithmetic has revealed that several models, each fine-tuned for distinct tasks, can be directly merged into a single model …
External link:
http://arxiv.org/abs/2310.02575
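Task arithmetic itself is concrete enough to sketch: each fine-tuned model defines a task vector tau_i = theta_i - theta_0 relative to the shared pre-trained initialization theta_0, and the merged model is theta_0 + lam * sum_i tau_i. The PyTorch sketch below implements exactly that; the scaling coefficient lam = 0.4 is an assumed hyperparameter, not a value from this paper.

import copy
import torch

def task_arithmetic(base, finetuned_models, lam=0.4):
    # Merge via task vectors: start from the pre-trained weights and
    # add the scaled sum of each model's delta from that initialization.
    base_state = base.state_dict()
    merged_state = {k: v.clone().float() for k, v in base_state.items()}
    for ft in finetuned_models:
        ft_state = ft.state_dict()
        for k in merged_state:
            merged_state[k] += lam * (ft_state[k].float() - base_state[k].float())
    merged = copy.deepcopy(base)
    merged.load_state_dict(merged_state)
    return merged

All models must share the base architecture; in practice lam is tuned on held-out validation data, since too large a coefficient degrades every task at once.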