Showing 1 - 10 of 79,282 for search: '"Lau, P."'
Author:
Lau, Gregory Kang Ruey, Niu, Xinyuan, Dao, Hieu, Chen, Jiangwei, Foo, Chuan-Sheng, Low, Bryan Kian Hsiang
Protecting intellectual property (IP) of text such as articles and code is increasingly important, especially as sophisticated attacks become possible, such as paraphrasing by large language models (LLMs) or even unauthorized training of LLMs on copy…
External link:
http://arxiv.org/abs/2407.04411
Speech contains information that is clinically relevant to some diseases, which has the potential to be used for health assessment. Recent work has shown interest in applying deep learning algorithms, especially pretrained large speech models, to the a…
External link:
http://arxiv.org/abs/2407.00531
Author:
Lau, Cheuk Fung
Let $f_1,\dots,f_k \in \mathbb{R}[X]$ be polynomials of degree at most $d$ with $f_1(0)=\dots=f_k(0)=0$. We show that there is an $n$…
External link:
http://arxiv.org/abs/2407.01611
Factual consistency is an important quality in dialogue summarization. Large language model (LLM)-based automatic text summarization models generate more factually consistent summaries than smaller pretrained language models, but they…
External link:
http://arxiv.org/abs/2406.14709
Author:
Xu, Xinyi, Wu, Zhaoxuan, Qiao, Rui, Verma, Arun, Shu, Yao, Wang, Jingtan, Niu, Xinyuan, He, Zhenfeng, Chen, Jiangwei, Zhou, Zijian, Lau, Gregory Kang Ruey, Dao, Hieu, Agussurja, Lucas, Sim, Rachael Hwee Ling, Lin, Xiaoqiang, Hu, Wenyang, Dai, Zhongxiang, Koh, Pang Wei, Low, Bryan Kian Hsiang
This position paper proposes a data-centric viewpoint of AI research, focusing on large language models (LLMs). We start by making the key observation that data is instrumental in the developmental (e.g., pretraining and fine-tuning) and inferential…
External link:
http://arxiv.org/abs/2406.14473
Modern deep neural networks often require distributed training with many workers due to their large size. As worker numbers increase, communication overheads become the main bottleneck in data-parallel minibatch stochastic gradient methods with per-i…
External link:
http://arxiv.org/abs/2406.13936
Planet formation models are necessary to understand the origins of diverse planetary systems. Circumstellar disc substructures have been proposed as preferred locations of planet formation, but a complete formation scenario has not been covered by a s…
External link:
http://arxiv.org/abs/2406.12340
An important factor when it comes to generating fact-checking explanations is the selection of evidence: intuitively, high-quality explanations can only be generated given the right evidence. In this work, we investigate the impact of human-curated v…
External link:
http://arxiv.org/abs/2406.12645
Since rainy weather degrades image quality and poses significant challenges to most computer vision-based intelligent systems, image de-raining has been a hot research topic. Fortunately, in a rainy light field (LF) image, background obscured…
External link:
http://arxiv.org/abs/2406.10652
We prove the Sato–Tate distribution of Kloosterman sums over function fields with explicit error terms, when the places vary in arithmetic progressions or short intervals. A joint Sato–Tate distribution of two "different" exponential sums is also…
External link:
http://arxiv.org/abs/2406.10106