Showing 1 - 10 of 194 for search: '"Chen, ZhiKai"'
Author:
Liu, Jingzhe, Mao, Haitao, Chen, Zhikai, Fan, Wenqi, Ju, Mingxuan, Zhao, Tong, Shah, Neil, Tang, Jiliang
Graph Neural Networks (GNNs) have emerged as a powerful tool to capture intricate network patterns, achieving success across different domains. However, existing GNNs require careful domain-specific architecture designs and training from scratch on…
External link:
http://arxiv.org/abs/2412.00315
Author:
Yu, Longxuan, Chen, Delin, Xiong, Siheng, Wu, Qingyang, Liu, Qingzhen, Li, Dawei, Chen, Zhikai, Liu, Xiaoze, Pan, Liangming
Causal reasoning (CR) is a crucial aspect of intelligence, essential for problem-solving, decision-making, and understanding the world. While large language models (LLMs) can generate rationales for their outputs, their ability to reliably perform…
External link:
http://arxiv.org/abs/2410.16676
Large Language Models (LLMs) have demonstrated remarkable performance across various natural language processing tasks. Recently, several LLM-based pipelines have been developed to enhance learning on graphs with text attributes, showcasing promising…
External link:
http://arxiv.org/abs/2407.12068
Author:
Song, Yu, Mao, Haitao, Xiao, Jiachen, Liu, Jingzhe, Chen, Zhikai, Jin, Wei, Yang, Carl, Tang, Jiliang, Liu, Hui
Pretraining plays a pivotal role in acquiring generalized knowledge from large-scale data, achieving remarkable successes as evidenced by large models in CV and NLP. However, progress in the graph domain remains limited due to fundamental challenges…
External link:
http://arxiv.org/abs/2406.13873
Author:
Chen, Zhikai, Mao, Haitao, Liu, Jingzhe, Song, Yu, Li, Bingheng, Jin, Wei, Fatemi, Bahare, Tsitsulin, Anton, Perozzi, Bryan, Liu, Hui, Tang, Jiliang
Given the ubiquity of graph data and its applications in diverse domains, building a Graph Foundation Model (GFM) that can work well across different graphs and tasks with a unified backbone has recently garnered significant interest. A major obstacle…
External link:
http://arxiv.org/abs/2406.10727
Author:
Fan, Wenqi, Wang, Shijie, Huang, Jiani, Chen, Zhikai, Song, Yu, Tang, Wenzhuo, Mao, Haitao, Liu, Hui, Liu, Xiaorui, Yin, Dawei, Li, Qing
Graphs play an important role in representing complex relationships in various domains like social networks, knowledge graphs, and molecular discovery. With the advent of deep learning, Graph Neural Networks (GNNs) have emerged as a cornerstone in…
External link:
http://arxiv.org/abs/2404.14928
Diffusion models are just at a tipping point for the image super-resolution task. Nevertheless, it is not trivial to capitalize on diffusion models for video super-resolution, which necessitates not only the preservation of visual appearance from low-resolution…
External link:
http://arxiv.org/abs/2403.17000
Session-based recommendation has gained increasing attention in recent years, aiming to offer tailored suggestions based on users' historical behaviors within sessions. To advance this field, a variety of methods have been developed, with ID-based…
External link:
http://arxiv.org/abs/2402.08921
Author:
Mao, Haitao, Chen, Zhikai, Tang, Wenzhuo, Zhao, Jianan, Ma, Yao, Zhao, Tong, Shah, Neil, Galkin, Mikhail, Tang, Jiliang
Graph Foundation Models (GFMs) are emerging as a significant research topic in the graph domain, aiming to develop graph models trained on extensive and diverse data to enhance their applicability across various tasks and domains. Developing GFMs…
External link:
http://arxiv.org/abs/2402.02216
Deep graph models (e.g., graph neural networks and graph transformers) have become important techniques for leveraging knowledge across various types of graphs. Yet, the neural scaling laws on graphs, i.e., how the performance of deep graph models…
External link:
http://arxiv.org/abs/2402.02054