Showing 1 - 10 of 797 for search: '"E Weinan"'
Author:
Shi, Yaorui, Li, Sihang, Zhang, Taiyan, Fang, Xi, Wang, Jiankun, Liu, Zhiyuan, Zhao, Guojiang, Zhu, Zhengdan, Gao, Zhifeng, Zhong, Renxin, Zhang, Linfeng, Ke, Guolin, E, Weinan, Cai, Hengxing, Wang, Xiang
Automated drug discovery offers significant potential for accelerating the development of novel therapeutics by substituting labor-intensive human workflows with machine-driven processes. However, a critical bottleneck persists in the inability of cu…
External link:
http://arxiv.org/abs/2412.07819
Transformers have demonstrated exceptional in-context learning capabilities, yet the theoretical understanding of the underlying mechanisms remains limited. A recent work (Elhage et al., 2021) identified a "rich" in-context mechanism known as the induction head…
External link:
http://arxiv.org/abs/2410.11474
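For context, the induction-head mechanism identified by Elhage et al. (2021) implements a simple pattern-completion rule: on a sequence containing ... [A][B] ... [A], the head attends back to the earlier occurrence of the current token and predicts the token that followed it. A minimal Python sketch of that rule, for illustration only (the function name is hypothetical; this is not code from the paper):

def induction_head_prediction(tokens):
    """Toy illustration of the induction-head rule: find the most recent
    earlier occurrence of the last token and predict the token that
    followed it; return None if the pattern never occurred."""
    last = tokens[-1]
    # scan earlier positions from right to left for a previous occurrence of `last`
    for i in range(len(tokens) - 2, -1, -1):
        if tokens[i] == last:
            return tokens[i + 1]  # copy the successor of the earlier match
    return None

# Example: "... the cat ... the" -> predicts "cat"
print(induction_head_prediction(["the", "cat", "sat", "on", "the"]))  # cat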
We report an extensive molecular dynamics study, of ab initio quality, of the ferroelectric phase transition in crystalline PbTiO3. We model anharmonicity accurately in terms of potential energy and polarization surfaces trained on density functional theory…
External link:
http://arxiv.org/abs/2410.06414
Author:
Xu, Fanjie, Guo, Wentao, Wang, Feng, Yao, Lin, Wang, Hongshuai, Tang, Fujie, Gao, Zhifeng, Zhang, Linfeng, E, Weinan, Tian, Zhong-Qun, Cheng, Jun
The study of structure-spectrum relationships is essential for spectral interpretation, impacting structural elucidation and material design. Predicting spectra from molecular structures is challenging due to their complex relationships. Herein, we i…
External link:
http://arxiv.org/abs/2408.15681
Author:
Zeng, Boshen, Chen, Sian, Liu, Xinxin, Chen, Changhong, Deng, Bin, Wang, Xiaoxu, Gao, Zhifeng, Zhang, Yuzhi, E, Weinan, Zhang, Linfeng
Advancements in lithium battery technology rely heavily on the design and engineering of electrolytes. However, current schemes for molecular design and recipe optimization of electrolytes lack an effective computational-experimental closed loop and…
External link:
http://arxiv.org/abs/2407.06152
Author:
Yang, Hongkang, Lin, Zehao, Wang, Wenjin, Wu, Hao, Li, Zhiyu, Tang, Bo, Wei, Wenqiang, Wang, Jinbo, Tang, Zeyun, Song, Shichao, Xi, Chenyang, Yu, Yu, Chen, Kai, Xiong, Feiyu, Tang, Linpeng, E, Weinan
The training and inference of large language models (LLMs) are together a costly process that transports knowledge from raw data to meaningful computation. Inspired by the memory hierarchy of the human brain, we reduce this cost by equipping LLMs with…
External link:
http://arxiv.org/abs/2407.01178
In recent years, pretraining models have made significant advancements in the fields of natural language processing (NLP), computer vision (CV), and the life sciences. The significant advancements in NLP and CV are predominantly driven by the expansion of…
External link:
http://arxiv.org/abs/2406.14969
Author:
Wang, Mingze, Wang, Jinbo, He, Haotian, Wang, Zilin, Huang, Guanhua, Xiong, Feiyu, Li, Zhiyu, E, Weinan, Wu, Lei
In this work, we propose an Implicit Regularization Enhancement (IRE) framework to accelerate the discovery of flat solutions in deep learning, thereby improving generalization and convergence. Specifically, IRE decouples the dynamics of flat and sharp…
External link:
http://arxiv.org/abs/2405.20763
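As a rough illustration of what "decoupling the dynamics of flat and sharp directions" can mean in general, one can project the gradient onto a designated sharp subspace (e.g. spanned by top Hessian eigenvectors) and take a larger step along the orthogonal, flat remainder. The sketch below illustrates only that generic idea, under assumed inputs (sharp_basis, flat_boost are hypothetical names); it is not the IRE algorithm from the paper:

import numpy as np

def decoupled_step(w, grad, sharp_basis, lr=0.1, flat_boost=3.0):
    """Hypothetical sketch: split the gradient into its projection onto a
    'sharp' subspace and the orthogonal 'flat' remainder, then take a larger
    effective step along the flat component. Not the paper's method."""
    # sharp_basis: (k, d) array with orthonormal rows spanning the sharp subspace
    g_sharp = sharp_basis.T @ (sharp_basis @ grad)   # sharp-subspace component
    g_flat = grad - g_sharp                          # flat (orthogonal) component
    return w - lr * (g_sharp + flat_boost * g_flat)  # boosted step in flat directions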
A data-driven ab initio generalized Langevin equation (AIGLE) approach is developed to learn and simulate high-dimensional, heterogeneous, coarse-grained conformational dynamics. Constrained by the fluctuation-dissipation theorem, the approach can build…
External link:
http://arxiv.org/abs/2405.12356
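For reference, the generalized Langevin equation for a coarse-grained coordinate x(t), with memory kernel K and random force R, together with the fluctuation-dissipation constraint mentioned in the abstract, takes the standard textbook form below; the paper's specific data-driven parametrization is not reproduced here.

\[
m\,\ddot{x}(t) = -\nabla U\big(x(t)\big) - \int_0^{t} K(t-s)\,\dot{x}(s)\,\mathrm{d}s + R(t),
\qquad
\langle R(t)\,R(s)\rangle = k_{B} T\, K(t-s).
\]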
Author:
Wang, Mingze, E, Weinan
We conduct a systematic study of the approximation properties of the Transformer for sequence modeling with long, sparse, and complicated memory. We investigate the mechanisms through which different components of the Transformer, such as the dot-product self-attention…
External link:
http://arxiv.org/abs/2402.00522
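The dot-product self-attention mentioned here is the standard operation from Vaswani et al. (2017): for an input X with query, key, and value projections of width d, a single head computes (generic definition, not a result of the paper)

\[
\mathrm{Attn}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d}}\right) V,
\qquad Q = X W_Q,\quad K = X W_K,\quad V = X W_V.
\]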