Showing 1 - 10 of 2,611 results for the search: '"Xin, Chun"'
Merging models has become a fundamental procedure in applications that consider model efficiency and robustness. Training randomness or non-I.I.D. data poses a huge challenge for averaging-based model fusion. Previous research efforts focus on…
External link:
http://arxiv.org/abs/2408.12237
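For context, "averaging-based model fusion" in its simplest form is an element-wise average of the parameters of identically structured networks. The sketch below illustrates that baseline, not the paper's method; average_models is a hypothetical helper.

```python
import copy
import torch
import torch.nn as nn

def average_models(models):
    # Hypothetical helper: element-wise average of the parameters of
    # identically structured models -- the plain averaging-based fusion
    # baseline that training randomness and non-I.I.D. data can break.
    merged = copy.deepcopy(models[0])
    avg_state = {
        key: torch.stack([m.state_dict()[key].float() for m in models]).mean(dim=0)
        for key in merged.state_dict()
    }
    merged.load_state_dict(avg_state)
    return merged

# Usage: fuse two independently trained copies of the same architecture.
nets = [nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2)) for _ in range(2)]
fused = average_models(nets)
print(fused(torch.randn(1, 4)))
```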
Knowledge Distillation (KD) can transfer the "dark knowledge" of a well-performing yet large neural network to a weaker but lightweight one. From the view of output logits and softened probabilities, this paper goes deeper into the dark knowledge…
External link:
http://arxiv.org/abs/2405.13078
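The output-logit view of dark knowledge usually starts from Hinton-style distillation, where teacher and student logits are softened by a temperature before a KL term is applied. A minimal sketch of that classic loss (not this paper's specific analysis):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Hinton-style distillation term: KL divergence between the
    # temperature-softened teacher and student distributions. The T*T
    # factor keeps gradient magnitudes comparable across temperatures.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * T * T

# Usage with random logits standing in for real network outputs.
print(kd_loss(torch.randn(8, 10), torch.randn(8, 10)).item())
```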
The loss landscape of deep neural networks (DNNs) is commonly considered complex and wildly fluctuating. However, an interesting observation is that the loss surfaces plotted along Gaussian noise directions are almost v-basin ones with the perturbed…
External link:
http://arxiv.org/abs/2405.12493
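A 1-D loss-surface slice along a Gaussian noise direction, of the kind the abstract describes, can be traced by evaluating the loss at θ + αd for a random direction d over a range of α. An illustrative sketch, assuming a loss_fn(model, data) evaluation callable; not the paper's exact protocol:

```python
import torch

def loss_along_gaussian_direction(model, loss_fn, data, alphas):
    # Evaluate the loss at theta + alpha * d for a random Gaussian
    # direction d -- a 1-D slice of the loss surface.
    theta = [p.detach().clone() for p in model.parameters()]
    d = [torch.randn_like(p) for p in theta]
    curve = []
    with torch.no_grad():
        for alpha in alphas:
            for p, p0, di in zip(model.parameters(), theta, d):
                p.copy_(p0 + alpha * di)
            curve.append(loss_fn(model, data).item())
        for p, p0 in zip(model.parameters(), theta):  # restore weights
            p.copy_(p0)
    return curve
```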
Exploring the loss landscape offers insights into the inherent principles of deep neural networks (DNNs). Recent work suggests an additional asymmetry of the valley beyond the flat and sharp ones, yet without thoroughly examining its causes or implications…
External link:
http://arxiv.org/abs/2405.12489
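Valley asymmetry can be probed the same way by comparing the loss increase on the two sides of a minimum along one direction; unequal increases at ±α indicate an asymmetric valley. A hypothetical sketch under the same loss_fn assumption as above:

```python
import torch

def valley_asymmetry(model, loss_fn, data, step=0.5):
    # Compare the loss increase at theta + step*d and theta - step*d
    # along one random direction d; unequal increases indicate an
    # asymmetric valley. Illustrative only, not the paper's procedure.
    theta = [p.detach().clone() for p in model.parameters()]
    d = [torch.randn_like(p) for p in theta]

    def loss_at(sign):
        with torch.no_grad():
            for p, p0, di in zip(model.parameters(), theta, d):
                p.copy_(p0 + sign * step * di)
            value = loss_fn(model, data).item()
            for p, p0 in zip(model.parameters(), theta):  # restore
                p.copy_(p0)
        return value

    base = loss_fn(model, data).item()
    return loss_at(+1.0) - base, loss_at(-1.0) - base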
Author:
Li, Xin-Chun, Song, Shaoming, Li, Yinchuan, Li, Bingshuai, Shao, Yunfeng, Yang, Yang, Zhan, De-Chuan
In some real-world applications, data samples are usually distributed on local devices, where federated learning (FL) techniques are proposed to coordinate decentralized clients without directly sharing users' private data. FL commonly follows the…
External link:
http://arxiv.org/abs/2404.09232
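The server-side aggregation such FL pipelines typically rely on is a data-size-weighted average of client parameters (FedAvg-style). A minimal sketch with hypothetical names; the paper's own protocol may differ:

```python
import copy
import torch

def fedavg_aggregate(global_model, client_states, client_sizes):
    # FedAvg-style server step: weight each client's parameters by its
    # local data size and average. client_states are state_dicts from
    # locally trained copies of global_model.
    total = float(sum(client_sizes))
    new_state = {
        key: sum(
            (n / total) * state[key].float()
            for state, n in zip(client_states, client_sizes)
        )
        for key in global_model.state_dict()
    }
    new_model = copy.deepcopy(global_model)
    new_model.load_state_dict(new_state)
    return new_model
```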
Due to the advantages of leveraging unlabeled data and learning meaningful representations, semi-supervised learning and contrastive learning have been progressively combined to achieve better performance in popular applications with few labeled data…
External link:
http://arxiv.org/abs/2312.09598
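The contrastive ingredient in such combinations is commonly a SimCLR-style NT-Xent objective over two augmented views, with matching rows as positives and the rest of the batch as negatives. A minimal sketch (not necessarily this paper's loss):

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    # SimCLR-style NT-Xent loss over paired embeddings: row i of z1 is
    # the positive for row i of z2; all other rows in the 2N batch act
    # as negatives. Self-similarities are masked out of the softmax.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d)
    sim = z @ z.t() / tau                               # cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # exclude self-pairs
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Usage with random embeddings standing in for two augmented views.
print(nt_xent(torch.randn(16, 128), torch.randn(16, 128)).item())
```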
Author:
Shao-Feng Duan, Ji-Chen Yu, Timothy Charles Baldwin, Yuan Yuan, Gui-Sheng Xiang, Rui Cui, Yan Zhao, Xin-Chun Mo, Ying-Chun Lu, Yan-Li Liang
Published in:
BMC Plant Biology, Vol 24, Iss 1, Pp 1-18 (2024)
Abstract: Background: MADS-box transcription factors have been shown to be involved in multiple developmental processes, including the regulation of floral organ formation and pollen maturation. However, the role of the MADS-box gene family in floral development…
External link:
https://doaj.org/article/d079bcf0df4b44b69124e1605d03b317
Author:
Ang, Guang Jun Nicholas, Goil, Aritejh Kr, Chan, Henryk, Lew, Jieyi Jeric, Lee, Xin Chun, Mustaffa, Raihan Bin Ahmad, Jason, Timotius, Woon, Ze Ting, Shen, Bingquan
In a landscape characterized by heightened connectivity and mobility, coupled with a surge in cardiovascular ailments, the imperative to curtail healthcare expenses through remote monitoring of cardiovascular health has become more pronounced. The…
External link:
http://arxiv.org/abs/2305.16727
We consider a real-world scenario in which a newly established pilot project needs to make inferences for newly collected data with the help of other parties under privacy-protection policies. Current federated learning (FL) paradigms are devoted to…
External link:
http://arxiv.org/abs/2305.04201
Author:
Li, Xin-Chun, Fan, Wen-Shu, Song, Shaoming, Li, Yinchuan, Li, Bingshuai, Shao, Yunfeng, Zhan, De-Chuan
Knowledge Distillation (KD) aims at transferring the knowledge of a well-performing neural network (the teacher) to a weaker one (the student). A peculiar phenomenon is that a more accurate model doesn't necessarily teach better, and temperature…
External link:
http://arxiv.org/abs/2210.04427
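The temperature this truncated abstract refers to controls how flat the softened distribution is; a quick illustrative demo with made-up logits shows higher T exposing more of the non-target class structure:

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([4.0, 2.0, 1.0, 0.5])  # made-up teacher logits
for T in (1.0, 2.0, 4.0, 8.0):
    p = F.softmax(logits / T, dim=0)
    # Higher T flattens the distribution, exposing more of the relative
    # similarity structure among non-target classes ("dark knowledge").
    print(f"T={T}:", [round(v, 3) for v in p.tolist()])
```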