Showing 1 - 10 of 2,075 for search: '"Han Xiaodong"'
Author:
Han Xiaodong
Published in:
Applied Mathematics and Nonlinear Sciences, Vol 9, Iss 1 (2024)
This paper discusses the critical impact of informatization automation on plant automation, and takes a water plant’s coagulation dosing automation control system as an example to explore the value of integrated wireless modules in the automation …
External link:
https://doaj.org/article/0a67931158f44718a26f8636a66c0990
Author:
Han Xiaodong
Published in:
Applied Mathematics and Nonlinear Sciences, Vol 9, Iss 1 (2024)
The article establishes the objective function of distributed electronic and electrical architecture from three dimensions of economy, quantization and loadability, and constructs a mathematical model for multi-objective optimization based on the …
External link:
https://doaj.org/article/6c80dd37678c413bad01493bdb3eddf9
Author:
Long Haibo, Zhao Yunsong, Zhao Junbo, Yuan Xiaoyi, Fan Hao, Luo Yushi, Li Wei, An Zibing, Mao Shengcheng, Liu Gang, Han Xiaodong
Published in:
National Science Open, Vol 3 (2023)
This study presents a design strategy to enhance the high-temperature creep resistance of Ni-based superalloys. This strategy focuses on two principles: (1) minimizing the dimensions of γ/γ′ interfaces and γ channels by reducing the size of the …
External link:
https://doaj.org/article/3f7f469ad9234cb0ae5d4d6c60e542f9
Author:
Yang, Dongjie, Huang, Suyuan, Lu, Chengqiang, Han, Xiaodong, Zhang, Haoxin, Gao, Yan, Hu, Yao, Zhao, Hai
Advancements in multimodal learning, particularly in video understanding and generation, require high-quality video-text datasets for improved model performance. Vript addresses this issue with a meticulously annotated corpus of 12K high-resolution …
External link:
http://arxiv.org/abs/2406.06040
Large Language Models (LLMs) have shown remarkable comprehension abilities but face challenges in GPU memory usage during inference, hindering their scalability for real-time applications like chatbots. To accelerate inference, we store computed keys …
External link:
http://arxiv.org/abs/2405.12532
Author:
Yang, Luyan, Savchenko, Andrii S., Zheng, Fengshan, Kiselev, Nikolai S., Rybakov, Filipp N., Han, Xiaodong, Blügel, Stefan, Dunin-Borkowski, Rafal E.
Published in:
Advanced Materials 2024, 2403274
Magnetic skyrmions are topologically nontrivial spin configurations that possess particle-like properties. Earlier research was mainly focused on a specific type of skyrmion with topological charge Q = -1. However, theoretical analyses of two-dimensional …
External link:
http://arxiv.org/abs/2403.16931
Author:
Wang, Xiangtong, Han, Xiaodong, Yang, Menglong, Xing, Chuan, Wang, Yuqi, Han, Songchen, Li, Wei
Low Earth orbit (LEO) mega-constellations rely on inter-satellite links (ISLs) to provide global connectivity. We note that, in addition to the general constellation parameters, the ISL spanning patterns also greatly influence the final network …
External link:
http://arxiv.org/abs/2312.15873
The aim of audio-visual segmentation (AVS) is to precisely differentiate audible objects within videos down to the pixel level. Traditional approaches often tackle this challenge by combining information from various modalities, where the contribution …
External link:
http://arxiv.org/abs/2308.08288
Author:
Qin, Zhen, Li, Dong, Sun, Weigao, Sun, Weixuan, Shen, Xuyang, Han, Xiaodong, Wei, Yunshen, Lv, Baohong, Luo, Xiao, Qiao, Yu, Zhong, Yiran
We present TransNormerLLM, the first linear attention-based Large Language Model (LLM) that outperforms conventional softmax attention-based models in terms of both accuracy and efficiency. TransNormerLLM evolves from the previous linear attention …
External link:
http://arxiv.org/abs/2307.14995
Author:
Qin, Zhen, Sun, Weixuan, Lu, Kaiyue, Deng, Hui, Li, Dongxu, Han, Xiaodong, Dai, Yuchao, Kong, Lingpeng, Zhong, Yiran
Relative positional encoding is widely used in vanilla and linear transformers to represent positional information. However, existing encoding methods of a vanilla transformer are not always directly applicable to a linear transformer, because the …
External link:
http://arxiv.org/abs/2307.09270