RGI: Regularized Graph Infomax for self-supervised learning on graphs

Authors: Pina, Oscar; Vilaplana, Verónica
Year of publication: 2023
Subject:
DOI: 10.48550/arxiv.2303.08644
Description: Self-supervised learning is gaining considerable attention as a way to avoid the need for extensive annotations in representation learning on graphs. We introduce Regularized Graph Infomax (RGI), a simple yet effective framework for node-level self-supervised learning on graphs that trains a graph neural network encoder by maximizing the mutual information between node-level local and global views, in contrast to previous works that employ graph-level global views. The method promotes predictability between the views while regularizing the covariance matrices of the representations. As a result, RGI is non-contrastive, does not depend on complex asymmetric architectures or training tricks, is augmentation-free, and does not rely on a two-branch architecture. We evaluate RGI in both transductive and inductive settings on popular graph benchmarks and show that it achieves state-of-the-art performance despite its simplicity.
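
Note: The objective sketched in the description, a predictability term between node-level views combined with covariance regularization of the representations, resembles VICReg-style criteria. The PyTorch sketch below illustrates one plausible form of such a loss under that assumption; it is not the authors' RGI implementation, and the predictor, penalty forms, and coefficients (lambda_pred, lambda_var, lambda_cov) are hypothetical choices for illustration only.

import torch
import torch.nn.functional as F

def covariance_penalty(z):
    # Penalize off-diagonal entries of the feature covariance matrix
    # so that embedding dimensions stay decorrelated.
    n, d = z.shape
    z = z - z.mean(dim=0)
    cov = (z.T @ z) / (n - 1)
    off_diag = cov - torch.diag(torch.diagonal(cov))
    return (off_diag ** 2).sum() / d

def variance_penalty(z, eps=1e-4):
    # Hinge on the per-dimension standard deviation to discourage
    # collapsed (constant) representations.
    std = torch.sqrt(z.var(dim=0) + eps)
    return F.relu(1.0 - std).mean()

def rgi_style_loss(local, global_, predictor,
                   lambda_pred=1.0, lambda_var=1.0, lambda_cov=0.04):
    # Predictability term: the local view should predict the global view.
    pred_loss = F.mse_loss(predictor(local), global_)
    # Regularization terms applied to both views.
    reg = sum(lambda_var * variance_penalty(z) + lambda_cov * covariance_penalty(z)
              for z in (local, global_))
    return lambda_pred * pred_loss + reg

# Hypothetical usage with node embeddings from a GNN encoder:
#   local = encoder(x, edge_index)           # (num_nodes, d) local view
#   global_ = neighborhood_readout(local)    # (num_nodes, d) node-level global view
#   predictor = torch.nn.Linear(d, d)
#   loss = rgi_style_loss(local, global_, predictor)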
Comment: 11 pages, 1 figure, preprint
Database: OpenAIRE