Showing 1 - 10 of 476
for the search: '"WANG Yiquan"'
This paper puts forth an innovative approach that fuses deep learning, fractal analysis, and turbulence feature extraction techniques to create abstract artworks in the style of Pollock. The content and style characteristics of the image are extracted …
External link:
http://arxiv.org/abs/2410.20519
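A minimal, illustrative sketch of one ingredient named in the abstract above, fractal analysis via box counting; the paper's actual pipeline (deep learning plus turbulence features) is not reproduced here, and the function name and threshold are assumptions:

    import numpy as np

    def box_counting_dimension(image, threshold=0.5):
        """Estimate the box-counting (fractal) dimension of a 2D image.

        Pixels above `threshold` count as filled; the dimension is the slope
        of log(box count) versus log(1 / box size)."""
        binary = np.asarray(image) > threshold
        max_exp = int(np.floor(np.log2(min(binary.shape))))
        sizes = 2 ** np.arange(max_exp, 0, -1)        # box sides: powers of two
        counts = []
        for size in sizes:
            h = binary.shape[0] // size * size
            w = binary.shape[1] // size * size
            blocks = binary[:h, :w].reshape(h // size, size, w // size, size)
            # Count boxes that contain at least one filled pixel.
            counts.append(blocks.any(axis=(1, 3)).sum())
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    if __name__ == "__main__":
        img = np.zeros((256, 256))
        img[64:192, 64:192] = 1.0          # a filled square: dimension close to 2
        print(box_counting_dimension(img))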
Author:
Wang, Yiquan
The route planning problem based on the greedy algorithm represents a method of identifying the optimal or near-optimal route between a given start point and end point. In this paper, the PCA method is first employed to reduce the dimensionality of the city evaluation …
External link:
http://arxiv.org/abs/2410.13226
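A hedged sketch of how the two ingredients named in the snippet might fit together: PCA reduces hypothetical city evaluation indicators, and a greedy nearest-neighbour rule builds the route. The actual coupling used in the paper is not visible from the snippet; all data and names below are invented for illustration.

    import numpy as np
    from sklearn.decomposition import PCA

    def greedy_route(start, end, waypoints, coords):
        """Build a route from `start` to `end`, visiting at each step the
        nearest unvisited waypoint. `coords` maps city index -> 2D point."""
        route, current = [start], start
        remaining = set(waypoints)
        while remaining:
            nearest = min(remaining,
                          key=lambda c: np.linalg.norm(coords[c] - coords[current]))
            route.append(nearest)
            remaining.remove(nearest)
            current = nearest
        route.append(end)
        return route

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        features = rng.normal(size=(8, 10))   # 8 cities, 10 hypothetical indicators
        reduced = PCA(n_components=2).fit_transform(features)
        coords = {i: reduced[i] for i in range(8)}
        print(greedy_route(start=0, end=7, waypoints=range(1, 7), coords=coords))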
In order to enhance students' initiative and participation in MOOC learning, this study constructed a multi-level network model based on Social Network Analysis (SNA). The model makes use of data pertaining to nearly 40,000 users and tens of thousands of …
External link:
http://arxiv.org/abs/2410.10658
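A small sketch of the kind of SNA indicators such a model could report, using networkx on hypothetical learner interaction edges; the paper's multi-level structure and its ~40,000-user dataset are not reproduced here.

    import networkx as nx

    # Hypothetical forum-reply edges among MOOC learners (a replied to b).
    edges = [("u1", "u2"), ("u1", "u3"), ("u2", "u3"), ("u4", "u1"), ("u5", "u1")]
    G = nx.DiGraph(edges)

    # Basic indicators often used to characterise participation.
    print("degree centrality:", nx.degree_centrality(G))
    print("replies received:", dict(G.in_degree()))
    print("network density:", nx.density(G))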
This paper introduces a novel bionic intelligent optimisation algorithm, the Octopus Inspired Optimization (OIO) algorithm, which is inspired by the neural structure of the octopus, especially its hierarchical and decentralised interaction properties. By simulating …
External link:
http://arxiv.org/abs/2410.07968
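The OIO update rules are not given in the snippet, so the sketch below only illustrates the organising idea it mentions: hierarchical, decentralised sub-populations searching in parallel while a top level tracks the global best. This is a generic toy optimiser, not the published algorithm, and every parameter is an assumption.

    import numpy as np

    def sphere(x):
        return float(np.sum(x ** 2))

    def decentralised_search(f, dim=5, tentacles=4, agents=10, iters=200, seed=0):
        # Independent sub-populations ("tentacles") explore locally; the top
        # level only records the best solution found anywhere. NOT the OIO rule.
        rng = np.random.default_rng(seed)
        pops = rng.uniform(-5, 5, size=(tentacles, agents, dim))
        best_x, best_f = None, np.inf
        for t in range(iters):
            step = 1.0 - t / iters                 # shrink exploration over time
            for k in range(tentacles):
                local_best = min(pops[k], key=f).copy()
                if f(local_best) < best_f:
                    best_x, best_f = local_best, f(local_best)
                # Each agent drifts toward its tentacle's local best, plus noise.
                pops[k] += (0.3 * (local_best - pops[k])
                            + step * rng.normal(size=(agents, dim)))
        return best_x, best_f

    if __name__ == "__main__":
        x, fx = decentralised_search(sphere)
        print(round(fx, 6))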
The OAM spectrum reflects the OAM components contained in a measured light field, which is crucial for OAM-based applications. However, the traditional definition-based OAM spectrum algorithm is extraordinarily time-consuming and severely limited by prior knowledge.
External link:
http://arxiv.org/abs/2409.06430
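For context, a sketch of the definition-based computation the snippet calls time-consuming: project the field onto the helical harmonics exp(i*l*phi) ring by ring, then integrate the ring powers over radius. The grid sizes and test field are assumptions; the paper's faster method is not shown.

    import numpy as np

    def oam_spectrum(field_fn, l_max=5, n_r=200, n_phi=256, r_max=3.0):
        """Definition-based OAM power spectrum of a field E(r, phi):
        a_l(r) = (1/2pi) * int_0^{2pi} E(r, phi) exp(-i l phi) dphi
        P_l    = int |a_l(r)|^2 r dr, normalised over l."""
        r = np.linspace(1e-6, r_max, n_r)
        phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
        R, PHI = np.meshgrid(r, phi, indexing="ij")     # shape (n_r, n_phi)
        E = field_fn(R, PHI)
        ls = np.arange(-l_max, l_max + 1)
        dphi, dr = phi[1] - phi[0], r[1] - r[0]
        power = np.empty(len(ls))
        for i, l in enumerate(ls):
            a_l = (E * np.exp(-1j * l * PHI)).sum(axis=1) * dphi / (2 * np.pi)
            power[i] = np.sum(np.abs(a_l) ** 2 * r) * dr
        return ls, power / power.sum()

    if __name__ == "__main__":
        # An illustrative vortex beam with topological charge l = 3.
        vortex = lambda r, phi: r * np.exp(-r ** 2) * np.exp(1j * 3 * phi)
        ls, p = oam_spectrum(vortex)
        print(dict(zip(ls.tolist(), np.round(p, 3))))   # peak at l = 3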
Quantum computing has gained considerable attention, especially after the arrival of the Noisy Intermediate-Scale Quantum (NISQ) era. Quantum processors and cloud services have become increasingly available worldwide. Unfortunately, programs on e…
External link:
http://arxiv.org/abs/2404.07882
Author:
Yang, Ke, Liu, Jiateng, Wu, John, Yang, Chaoqi, Fung, Yi R., Li, Sha, Huang, Zixuan, Cao, Xu, Wang, Xingyao, Wang, Yiquan, Ji, Heng, Zhai, Chengxiang
The prominent large language models (LLMs) of today differ from past language models not only in size, but also in the fact that they are trained on a combination of natural language and formal language (code). As a medium between humans and computers …
External link:
http://arxiv.org/abs/2401.00812
Author:
Wang, Yiquan, Lv, Huibin, Teo, Qi Wen, Lei, Ruipeng, Gopal, Akshita B., Ouyang, Wenhao O., Yeung, Yuen-Hei, Tan, Timothy J.C., Choi, Danbi, Shen, Ivana R., Chen, Xin, Graham, Claire S., Wu, Nicholas C.
Published in:
In Immunity, 8 October 2024, 57(10):2453-2465
Published in:
In Tribology International, July 2024, vol. 195
Published in:
In Optics and Laser Technology, February 2024, vol. 169