Showing 1 - 10 of 218 for search: '"Lu, Chenguang"'
Author:
Lu, Chenguang
Recent advances in deep learning suggest that we need to maximize and minimize two different kinds of information simultaneously. The Information Max-Min (IMM) method has been used in deep learning, reinforcement learning, and maximum entropy control…
External link:
http://arxiv.org/abs/2411.05789
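The snippet does not say which two kinds of information IMM trades off. As a hedged illustration of an objective with this max-min shape only (the information bottleneck Lagrangian, an assumption of this note rather than the paper's IMM criterion):

% The information bottleneck: compress X into Z (minimize I(X;Z))
% while keeping Z informative about the target Y (maximize I(Z;Y)).
\min_{p(z \mid x)} \; I(X;Z) - \beta\, I(Z;Y), \qquad \beta > 0

Larger \beta favors informativeness over compression; the paper's own IMM objective may combine different information terms.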
Author:
Lu, Chenguang
The Variational Bayesian method (VB) is used to solve for the probability distributions of latent variables under the minimum free energy criterion. This criterion is not easy to understand, and the computation is complex. For these reasons, this paper…
External link:
http://arxiv.org/abs/2408.13122
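For reference, the minimum free energy criterion mentioned above is standardly written as follows (standard VB notation assumed here; the paper's own notation may differ):

F(q) \;=\; \mathbb{E}_{q(z)}\!\big[\log q(z) - \log p(x, z)\big]
     \;=\; \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big) \;-\; \log p(x)

Since \log p(x) is constant in q, minimizing F drives q(z) toward the true posterior p(z \mid x), and -F lower-bounds the evidence \log p(x).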
Author:
Lu, Chenguang
A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise-Contrastive Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning…
External link:
http://arxiv.org/abs/2305.14397
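To make the similarity-function idea concrete, here is a minimal numpy sketch of the standard InfoNCE loss (the batch size, embedding dimension, temperature, and dot-product similarity are illustrative assumptions, not details taken from the paper):

import numpy as np

def info_nce(x, y, temperature=0.1):
    """x, y: (batch, dim) arrays of paired embeddings."""
    sim = x @ y.T / temperature            # (batch, batch) similarity scores
    sim -= sim.max(axis=1, keepdims=True)  # stabilize the row-wise softmax
    log_softmax = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # The diagonal holds the true pairs; InfoNCE is their mean negative
    # log-probability. log(batch) minus this loss lower-bounds I(X;Y).
    return -np.mean(np.diag(log_softmax))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
y = x + 0.1 * rng.normal(size=(8, 16))    # correlated pairs
print(info_nce(x, y))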
Author:
Lu, Chenguang
Published in:
Entropy, 2023, 25(1), 143
When we compare the influences of two causes on an outcome, if the conclusion from every group contradicts the conclusion from the conflated data, we say that Simpson's Paradox has occurred. The Existing Causal Inference Theory (ECIT) can make the overall conclusion consistent…
External link:
http://arxiv.org/abs/2302.09067
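A small worked instance of the reversal described above, using the classic kidney-stone treatment numbers (a standard textbook dataset, not data from this paper): treatment A wins inside every subgroup yet loses in the conflated table.

# Success counts per treatment, split by subgroup and then conflated.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}
totals = {"A": [0, 0], "B": [0, 0]}
for name, group in groups.items():
    for t, (successes, n) in group.items():
        totals[t][0] += successes
        totals[t][1] += n
        print(f"{name} {t}: {successes}/{n} = {successes / n:.0%}")
for t, (successes, n) in totals.items():
    print(f"overall {t}: {successes}/{n} = {successes / n:.0%}")
# Prints A at 93% and 73% in the subgroups (B: 87% and 69%),
# but A at 78% overall versus B at 83%: the conclusion flips.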
Author:
Lu, Chenguang
To improve communication efficiency and provide more useful information, we need to measure semantic information by combining inaccuracy or distortion, freshness, purposiveness, and efficiency. The author proposed the semantic information G measure…
External link:
http://arxiv.org/abs/2304.13502
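In the author's related papers, the G measure is built from (fuzzy) truth functions; the form below is reconstructed from that related work, so treat the notation as an assumption rather than this paper's definition:

% Semantic information conveyed about x_i by predicate y_j with
% truth function T(\theta_j \mid x): log truth value over logical probability.
I(x_i; \theta_j) \;=\; \log \frac{T(\theta_j \mid x_i)}{T(\theta_j)},
\qquad
T(\theta_j) \;=\; \sum_i P(x_i)\, T(\theta_j \mid x_i)

Averaging I(x_i; \theta_j) over the joint distribution P(x_i, y_j) then yields a semantic mutual information, the counterpart of Shannon's mutual information.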
Author:
Lu, Chenguang
Published in:
Entropy, 2021, 23, 1050
In the rate-distortion function and the Maximum Entropy (ME) method, Minimum Mutual Information (MMI) distributions and ME distributions are expressed by Bayes-like formulas, including Negative Exponential Functions (NEFs) and partition functions.
External link:
http://arxiv.org/abs/2110.07769
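The Bayes-like forms referred to above are standard in both settings; with notation assumed here (sign conventions vary across texts), the MMI distribution of rate-distortion theory and the ME distribution read:

% Rate-distortion: the mutual-information-minimizing channel for
% distortion d(x, y) and multiplier s <= 0 (an NEF over a partition function).
P^{*}(y \mid x) \;=\; \frac{P(y)\, e^{s\, d(x,y)}}{Z(x)},
\qquad Z(x) \;=\; \sum_{y} P(y)\, e^{s\, d(x,y)}

% Maximum entropy under moment constraints E[f_i(X)] = c_i.
P^{*}(x) \;=\; \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big),
\qquad Z(\lambda) \;=\; \sum_x \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big)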
Author:
Lu, Chenguang
Why does the Expectation-Maximization (EM) algorithm for mixture models converge? Why do different initial parameters cause various convergence difficulties? The Q-L synchronization theory explains that the observed data log-likelihood L and the complete data log-likelihood Q…
External link:
http://arxiv.org/abs/2104.12592
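To make L concrete, a minimal EM sketch for a two-component 1-D Gaussian mixture that prints the observed-data log-likelihood at every iteration (the fixed unit variances and toy data are illustrative assumptions, not taken from the paper):

import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])

w = np.array([0.5, 0.5])     # initial mixing weights
mu = np.array([-1.0, 1.0])   # initial component means
for step in range(20):
    # E-step: responsibilities under unit-variance Gaussian components.
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2) / np.sqrt(2 * np.pi)
    r = dens / dens.sum(axis=1, keepdims=True)
    # Observed-data log-likelihood L; EM guarantees it never decreases.
    L = np.log(dens.sum(axis=1)).sum()
    # M-step: re-estimate mixing weights and means.
    w = r.mean(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    print(f"step {step:2d}: L = {L:.2f}, mu = {mu.round(3)}")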
Author:
Lu, Chenguang
Published in:
Philosophies, 2020, 5(4), 25
Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time…
External link:
http://arxiv.org/abs/2011.00992
Author:
Lu, Chenguang
The popular convergence theory of the EM algorithm explains that the observed incomplete data log-likelihood L and the complete data log-likelihood Q are positively correlated, and that we can maximize L by maximizing Q. The Deterministic Annealing EM (DAEM)…
External link:
http://arxiv.org/abs/2007.12845
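For context, deterministic annealing replaces the ordinary E-step posterior with a temperature-flattened one (the standard DAEM form from the general literature, stated here as background rather than as this paper's contribution):

% Annealed responsibilities at inverse temperature \beta; \beta is raised
% from a small value toward 1, and \beta = 1 recovers the usual E-step.
p_{\beta}(k \mid x) \;=\;
\frac{\big[w_k\, p(x \mid \theta_k)\big]^{\beta}}
     {\sum_{j} \big[w_j\, p(x \mid \theta_j)\big]^{\beta}}

Small \beta makes the responsibilities nearly uniform, which smooths the likelihood surface and reduces sensitivity to the initial parameters.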
Author:
Lu, Chenguang
After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel discovered the Raven Paradox (RP). Then, Carnap used the logical…
External link:
http://arxiv.org/abs/2001.07566