Showing 1 - 10 of 173 for search: '"Gao, Hongyang"'
We introduce G2T-LLM, a novel approach for molecule generation that uses graph-to-tree text encoding to transform graph-based molecular structures into a hierarchical text format optimized for large language models (LLMs). This encoding converts …
External link:
http://arxiv.org/abs/2410.02198
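The snippet above only hints at how the graph-to-tree encoding works, so here is a minimal, hypothetical sketch of the general idea (not the actual G2T-LLM format from the paper): a toy molecular graph is unfolded into a rooted tree by depth-first traversal and serialized as hierarchical JSON text. The atom labels, the adjacency-list graph, and the JSON layout are assumptions chosen for illustration.

```python
import json

# Toy molecular graph (ethanol heavy atoms) as an adjacency list.
# This is an illustrative assumption, NOT the encoding used by G2T-LLM
# (arXiv:2410.02198); it only sketches the general graph-to-tree idea.
atoms = {0: "C", 1: "C", 2: "O"}
bonds = {0: [1], 1: [0, 2], 2: [1]}

def graph_to_tree(node, visited=None):
    """Unfold the graph into a rooted tree via depth-first traversal so the
    molecule can be written out as hierarchical text for an LLM."""
    if visited is None:
        visited = set()
    visited.add(node)
    return {
        "atom": atoms[node],
        "children": [
            graph_to_tree(nbr, visited)
            for nbr in bonds[node]
            if nbr not in visited
        ],
    }

# Hierarchical text representation that an LLM could consume or generate.
print(json.dumps(graph_to_tree(0), indent=2))
```

In this sketch, serializing a spanning tree rather than the raw edge list keeps the text strictly nested, which is what makes the resulting format hierarchical.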
Author:
Yu, Zhaoning, Gao, Hongyang
Graph Neural Networks (GNNs) have shown remarkable success in molecular tasks, yet their interpretability remains challenging. Traditional model-level explanation methods like XGNN and GNNInterpreter often fail to identify valid substructures like …
External link:
http://arxiv.org/abs/2405.12519
Deep learning models are trained with certain assumptions about the data during the development stage and then used for prediction in the deployment stage. It is important to reason about the trustworthiness of the model's predictions with unseen data …
External link:
http://arxiv.org/abs/2401.14628
Author:
Yu, Zhaoning, Gao, Hongyang
Motif extraction is an important task in motif-based molecular representation learning. Previously, machine learning approaches employed either rule-based or string-based techniques to extract motifs. Rule-based approaches may extract motifs that are …
External link:
http://arxiv.org/abs/2312.15387
Neural networks with wide layers have attracted significant attention due to their equivalence to Gaussian processes, enabling perfect fitting of training data while maintaining generalization performance, known as benign overfitting. However, existing …
External link:
http://arxiv.org/abs/2310.10767
Deep learning-based vulnerability detection has shown great performance and, in some studies, outperformed static analysis tools. However, the highest-performing approaches use token-based transformer models, which are not the most efficient to capture …
External link:
http://arxiv.org/abs/2212.08108
Author:
Gao, Tianxiang, Gao, Hongyang
Implicit neural networks have become increasingly attractive in the machine learning community since they can achieve competitive performance but use much less computational resources. Recently, a line of theoretical works established the global convergence …
External link:
http://arxiv.org/abs/2209.15562
Author:
Gao, Tianxiang, Gao, Hongyang
Implicit deep learning has recently become popular in the machine learning community since these implicit models can achieve competitive performance with state-of-the-art deep networks while using significantly less memory and computational resources …
External link:
http://arxiv.org/abs/2205.07463
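The two implicit-model entries above both concern networks whose output is defined by an equilibrium condition rather than by a stack of explicit layers. A minimal sketch of that general idea follows, assuming a toy NumPy setup; the tanh update, the contraction scaling of W, and the fixed-point solver are illustrative choices, not the specific models analyzed in these papers.

```python
import numpy as np

# Minimal illustration of an implicit layer: the output z* is defined
# implicitly by z* = tanh(W z* + U x) and computed by fixed-point
# iteration instead of evaluating a stack of explicit layers.
# Dimensions, scaling, and solver below are illustrative assumptions.
rng = np.random.default_rng(0)
d = 8
A = rng.standard_normal((d, d))
W = 0.5 * A / np.linalg.norm(A, 2)   # spectral norm 0.5 -> contraction, so the iteration converges
U = rng.standard_normal((d, d))
x = rng.standard_normal(d)

z = np.zeros(d)
for _ in range(100):                 # simple fixed-point (Picard) iteration
    z_new = np.tanh(W @ z + U @ x)
    if np.linalg.norm(z_new - z) < 1e-8:
        break
    z = z_new

print("equilibrium layer output z*:", z)
```

Because the forward pass stores only the equilibrium point, and gradients can be obtained through the implicit function theorem rather than by backpropagating through every iteration, memory use stays roughly constant in the effective depth, which is the efficiency argument both abstracts allude to.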
Published in:
Journal of Energy Storage, 10 October 2024, 99 Part B
Published in:
Applied Surface Science, 1 October 2024, 669