Showing 1 - 10 of 545 for search: '"ZHANG Honghua"'
Large Language Models (LLMs) are typically shipped with tokenizers that deterministically encode text into so-called canonical token sequences, to which the LLMs assign probability values. One common assumption is that the probability of a piece of t…
External link:
http://arxiv.org/abs/2408.08541
Published in:
Nantong Daxue xuebao. Ziran kexue ban, Vol 20, Iss 4, Pp 1-14 (2021)
Besides the surface plasmon resonance characteristic of noble metal nanoparticles, the two-dimensional (2D) noble metal nanoparticle ordered arrays can generate new or enhanced optical properties with good stability and reproducibility due to their hi…
External link:
https://doaj.org/article/42f3669f032345709fe997b2757a5af8
Despite the success of Large Language Models (LLMs) on various tasks following human instructions, controlling model generation at inference time poses a persistent challenge. In this paper, we introduce Ctrl-G, an adaptable framework that facilitate…
External link:
http://arxiv.org/abs/2406.13892
Published in:
In Proceedings of the 40th Conference on Uncertainty in Artificial Intelligence (UAI), 2024
Probabilistic circuits compute multilinear polynomials that represent multivariate probability distributions. They are tractable models that support efficient marginal inference. However, various polynomial semantics have been considered in the liter…
External link:
http://arxiv.org/abs/2402.09085
Despite the success of autoregressive large language models in text generation, it remains a major challenge to generate text that satisfies complex constraints: sampling from the conditional distribution ${\Pr}(\text{text} \mid \alpha)$ is intractable…
External link:
http://arxiv.org/abs/2304.07438
Tree-shaped graphical models are widely used for their tractability. However, they unfortunately lack expressive power as they require committing to a particular sparse dependency structure. We propose a novel class of generative models called mixtur…
External link:
http://arxiv.org/abs/2302.14202
Probabilistic Circuits (PCs) are a unified framework for tractable probabilistic models that support efficient computation of various probabilistic queries (e.g., marginal probabilities). One key challenge is to scale PCs to model large and high-dime…
External link:
http://arxiv.org/abs/2210.04398
Logical reasoning is needed in a wide range of NLP tasks. Can a BERT model be trained end-to-end to solve logical reasoning problems presented in natural language? We attempt to answer this question in a confined problem space where there exists a se…
External link:
http://arxiv.org/abs/2205.11502
Author:
Zhang, Zhongqing, Zhang, Honghua, Zhao, Junfeng, Liu, Yunfeng, Xie, Shengpeng, Han, Anjun, Zhang, Liping, Liu, Zhengxin, Liu, Wei
Published in:
In Solar Energy Materials and Solar Cells 15 October 2024 277
Author:
Zhang, Honghua, Liang, Shaoxian, Yin, Kewan, Mo, Yufeng, Li, Yamin, Lv, Yaning, Zhan, Hao, Zhang, Zhuang, Shan, Zhilei, Guo, Zhiguo, Yin, Shi, Yang, Wanshui
Published in:
In The Journal of Nutrition September 2024 154(9):2843-2851