Showing 1 - 10 of 6,611 for search: '"Dongqi An"'
Author:
Hao Yu, Daojing Gan, Zhen Luo, Qilin Yang, Dongqi An, Hao Zhang, Yingchun Hu, Zhuang Ma, Qingchun Zeng, Dingli Xu, Hao Ren
Published in:
Molecular Medicine, Vol 30, Iss 1, Pp 1-14 (2024)
Abstract: Background: In heart failure (HF), mitochondrial dysfunction and metabolic remodeling lead to a reduction in energy productivity and aggravate cardiomyocyte injury. Supplementation with α-ketoglutarate (AKG) alleviated myocardial hypertrophy …
External link:
https://doaj.org/article/c690094507a04a0e8f2266ae076e04fa
Published in:
Micromachines, Vol 15, Iss 3, p 371 (2024)
In recent years, global attention towards new energy has surged due to increasing energy demand and environmental concerns. Researchers have intensified their focus on new energy, leading to advancements in technologies like triboelectrification …
External link:
https://doaj.org/article/5dcaccd7831043a7b1f875c3c949844e
Author:
Wang, Limei, Hassani, Kaveh, Zhang, Si, Fu, Dongqi, Yuan, Baichuan, Cong, Weilin, Hua, Zhigang, Wu, Hao, Yao, Ning, Long, Bo
Transformers serve as the backbone architectures of Foundational Models, where a domain-specific tokenizer helps them adapt to various domains. Graph Transformers (GTs) have recently emerged as a leading model in geometric deep learning …
External link:
http://arxiv.org/abs/2410.13798
Graphs have been widely used in the past decades of big data and AI to model comprehensive relational data. When analyzing a graph's statistical properties, graph laws serve as essential tools for parameterizing its structure. Identifying meaningful …
External link:
http://arxiv.org/abs/2410.12126
Author:
Iacob, Alex, Sani, Lorenzo, Kurmanji, Meghdad, Shen, William F., Qiu, Xinchi, Cai, Dongqi, Gao, Yan, Lane, Nicholas D.
Language Model pre-training benefits from a broader data mixture to enhance performance across domains and languages. However, training on such heterogeneous text corpora is complex, requiring extensive and cost-intensive efforts. Since these data …
External link:
http://arxiv.org/abs/2410.05021
Author:
Xu, Zhe, Hassani, Kaveh, Zhang, Si, Zeng, Hanqing, Yasunaga, Michihiro, Wang, Limei, Fu, Dongqi, Yao, Ning, Long, Bo, Tong, Hanghang
Language Models (LMs) are increasingly challenging the dominance of domain-specific models, including Graph Neural Networks (GNNs) and Graph Transformers (GTs), in graph learning tasks. Following this trend, we propose a novel approach that empowers …
External link:
http://arxiv.org/abs/2410.02296
Multivariate Time Series (MTS) forecasting is a fundamental task with numerous real-world applications, such as transportation, climate, and epidemiology. While a myriad of powerful deep learning models have been developed for this task, few works …
External link:
http://arxiv.org/abs/2410.02195
Author:
Lu, Zhenyan, Li, Xiang, Cai, Dongqi, Yi, Rongjie, Liu, Fangming, Zhang, Xiwen, Lane, Nicholas D., Xu, Mengwei
Small language models (SLMs), despite their widespread adoption in modern smart devices, have received significantly less academic attention compared to their large language model (LLM) counterparts, which are predominantly deployed in data centers …
External link:
http://arxiv.org/abs/2409.15790
Author:
Fan, Dongqi, Chen, Tao, Wang, Mingjie, Ma, Rui, Tang, Qiang, Yi, Zili, Wang, Qian, Chang, Liang
Current Pose-Guided Person Image Synthesis (PGPIS) methods depend heavily on large amounts of labeled triplet data to train the generator in a supervised manner. However, they often falter when applied to in-the-wild samples, primarily due to the …
External link:
http://arxiv.org/abs/2409.09593
Human memory is inherently prone to forgetting. To address this, multimodal embedding models have been introduced, which transform diverse real-world data into a unified embedding space. These embeddings can be retrieved efficiently, aiding mobile …
External link:
http://arxiv.org/abs/2409.15342