Showing 1 - 10
of 10,749
for search: '"YAO, Jun"'
Author:
Grosnit, Antoine, Maraval, Alexandre, Doran, James, Paolo, Giuseppe, Thomas, Albert, Beevi, Refinath Shahul Hameed Nabeezath, Gonzalez, Jonas, Khandelwal, Khyati, Iacobacci, Ignacio, Benechehab, Abdelhakim, Cherkaoui, Hamza, El-Hili, Youssef Attia, Shao, Kun, Hao, Jianye, Yao, Jun, Kegl, Balazs, Bou-Ammar, Haitham, Wang, Jun
We introduce Agent K v1.0, an end-to-end autonomous data science agent designed to automate, optimise, and generalise across diverse data science tasks. Fully automated, Agent K v1.0 manages the entire data science life cycle by learning from experie…
External link:
http://arxiv.org/abs/2411.03562
Author:
Lin, Haoran, Yu, Xianzhi, Zhao, Kang, Hou, Lu, Zhan, Zongyuan, Kamenev, Stanislav, Bao, Han, Hu, Ting, Wang, Mingkai, Chang, Qixin, Sui, Siyue, Sun, Weihao, Hu, Jiaxin, Yao, Jun, Yin, Zekun, Qian, Cheng, Zhang, Ying, Pan, Yinfei, Yang, Yu, Liu, Weiguo
The FlashAttention series has been widely applied in the inference of large language models (LLMs). However, the FlashAttention series only supports high-end GPU architectures, e.g., Ampere and Hopper. At present, the FlashAttention series is not easily tr…
External link:
http://arxiv.org/abs/2410.16663
Author:
Sun, Yuxuan, Liu, Ruikang, Bai, Haoli, Bao, Han, Zhao, Kang, Li, Yuening, Hu, Jiaxin, Yu, Xianzhi, Hou, Lu, Yuan, Chun, Jiang, Xin, Liu, Wulong, Yao, Jun
Recently, quantization has been widely used for the compression and acceleration of large language models (LLMs). Due to the outliers in LLMs, it is crucial to flatten weights and activations to minimize quantization error with the equally spaced qua…
External link:
http://arxiv.org/abs/2410.09426
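The abstract above notes that outliers make it crucial to flatten weights and activations before equally spaced quantization. A minimal sketch of why (not the paper's method; the bit width, tensor values, and helper name are illustrative assumptions) is that a single outlier stretches the quantization range and thus the step size for every element:

```python
import numpy as np

def uniform_quantize(x, bits=4):
    # Equally spaced (uniform) quantization over the tensor's full range:
    # the step size is (max - min) / (2^bits - 1), so one outlier
    # enlarges the step for every other value.
    levels = 2 ** bits - 1
    scale = (x.max() - x.min()) / levels
    q = np.round((x - x.min()) / scale)
    return q * scale + x.min()

rng = np.random.default_rng(0)
w = rng.normal(size=1000)          # well-behaved weights
w_out = w.copy()
w_out[0] = 50.0                    # a single large outlier

err_plain = np.abs(uniform_quantize(w) - w).mean()
err_out = np.abs(uniform_quantize(w_out) - w_out).mean()
# The outlier inflates the step size and hence the mean error,
# which is why flattening outliers matters for LLM quantization.
assert err_out > err_plain
```

The same effect motivates outlier-aware schemes generally: shrinking the dynamic range before quantizing shrinks the per-element rounding error.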
Author:
Chen, Kai, Gou, Yunhao, Huang, Runhui, Liu, Zhili, Tan, Daxin, Xu, Jing, Wang, Chunwei, Zhu, Yi, Zeng, Yihan, Yang, Kuo, Wang, Dingdong, Xiang, Kun, Li, Haoyuan, Bai, Haoli, Han, Jianhua, Li, Xiaohui, Jin, Weike, Xie, Nian, Zhang, Yu, Kwok, James T., Zhao, Hengshuang, Liang, Xiaodan, Yeung, Dit-Yan, Chen, Xiao, Li, Zhenguo, Zhang, Wei, Liu, Qun, Yao, Jun, Hong, Lanqing, Hou, Lu, Xu, Hang
GPT-4o, an omni-modal model that enables vocal conversations with diverse emotions and tones, marks a milestone for omni-modal foundation models. However, empowering Large Language Models to perceive and generate images, texts, and speeches end-to-en…
External link:
http://arxiv.org/abs/2409.18042
The optical counterparts of skyrmions have recently been constructed with diverse topological types and by different degrees of freedom, such as field, spins, and Stokes vectors, exhibiting extensive potential in modern information science. However, …
External link:
http://arxiv.org/abs/2409.05689
Author:
Shen, Kang, Liu, Xuxiong, Wang, Boyan, Yao, Jun, Liu, Xin, Guan, Yujie, Wang, Yu, Li, Gengchen, Sun, Xiao
In this paper, we present our approach to addressing the challenges of the 7th ABAW competition. The competition comprises three sub-challenges: Valence Arousal (VA) estimation, Expression (Expr) classification, and Action Unit (AU) detection. To tac…
External link:
http://arxiv.org/abs/2407.12258
Author:
Liu, Xuxiong, Shen, Kang, Yao, Jun, Wang, Boyan, Liu, Minrui, An, Liuwei, Cui, Zishun, Feng, Weijie, Sun, Xiao
Compound Expression Recognition (CER) is vital for effective interpersonal interactions. Human emotional expressions are inherently complex due to the presence of compound expressions, requiring the consideration of both local and global facial cues…
External link:
http://arxiv.org/abs/2407.12257
Author:
Kang, Jikun, Li, Xin Zhe, Chen, Xi, Kazemi, Amirreza, Sun, Qianyi, Chen, Boxing, Li, Dong, He, Xu, He, Quan, Wen, Feng, Hao, Jianye, Yao, Jun
Although Large Language Models (LLMs) achieve remarkable performance across various tasks, they often struggle with complex reasoning tasks, such as answering mathematical questions. Recent efforts to address this issue have primarily focused on leve…
External link:
http://arxiv.org/abs/2405.16265
Author:
Liu, Ruikang, Bai, Haoli, Lin, Haokun, Li, Yuening, Gao, Han, Xu, Zhengzhuo, Hou, Lu, Yao, Jun, Yuan, Chun
Large language models (LLMs) excel in natural language processing but demand intensive computation. To mitigate this, various quantization methods have been explored, yet they compromise LLM performance. This paper unveils a previously overlooked typ…
External link:
http://arxiv.org/abs/2403.01241
Author:
Ji, Qingchun, Yao, Jun
In this paper, we develop $L^2$ theory for Riemannian and Hermitian foliations on manifolds with basic boundary. We establish a decomposition theorem, various vanishing theorems, a twisted duality theorem for basic cohomologies, and an extension theor…
External link:
http://arxiv.org/abs/2402.08196