Edge Entropy as an Indicator of the Effectiveness of GNNs over CNNs for Node Classification

Author: Mark Cheung, Jose M. F. Moura, Lavender Yao Jiang, John Shi, Oren Wright
Language: English
Year of publication: 2020
Subject:
Signal Processing (eess.SP)
FOS: Computer and information sciences
Computer Science - Machine Learning
Computer science
Computer Science::Neural and Evolutionary Computation
Structure (category theory)
010103 numerical & computational mathematics
02 engineering and technology
01 natural sciences
Convolutional neural network
Machine Learning (cs.LG)
FOS: Electrical engineering, electronic engineering, information engineering
0202 electrical engineering, electronic engineering, information engineering
Entropy (information theory)
0101 mathematics
Electrical Engineering and Systems Science - Signal Processing
business.industry
Deep learning
Node (networking)
020206 networking & telecommunications
Pattern recognition
Graph (abstract data type)
Enhanced Data Rates for GSM Evolution
Artificial intelligence
Performance improvement
business
Source: ACSSC
Description: Graph neural networks (GNNs) extend convolutional neural networks (CNNs) to graph-based data. A natural question is how much performance improvement the underlying graph structure used by a GNN provides over a CNN that ignores this structure. To address this question, we introduce edge entropy and evaluate how well it indicates the possible performance gain of GNNs over CNNs. Our results on node classification with synthetic and real datasets show that lower edge entropy predicts larger expected performance gains of GNNs over CNNs, while higher edge entropy predicts smaller expected gains.
Database: OpenAIRE
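
The record above does not reproduce the paper's exact definition of edge entropy. As a rough illustration of the kind of quantity the abstract describes, the sketch below assumes a label-based variant: for each class, take the distribution of node labels at the other end of that class's edges and compute its Shannon entropy, then average over classes. The function name `edge_entropy` and this particular formulation are assumptions for illustration, not the authors' definition.

```python
# Illustrative sketch only: assumes a label-based edge entropy, not the
# paper's exact definition. For each class, we build the distribution of
# labels at the other endpoint of its edges and average the per-class
# Shannon entropies.
from collections import Counter
from math import log2


def edge_entropy(edges, labels):
    """edges: iterable of (u, v) node pairs; labels: dict node -> class label.

    Returns the mean per-class entropy (in bits) of neighbor-label
    distributions. Lower values mean edges tend to stay within a class,
    which the abstract associates with larger GNN-over-CNN gains.
    """
    neighbor_labels = {}  # class -> Counter of labels at the other endpoint
    for u, v in edges:
        for a, b in ((u, v), (v, u)):  # treat the graph as undirected
            neighbor_labels.setdefault(labels[a], Counter())[labels[b]] += 1

    entropies = []
    for counts in neighbor_labels.values():
        total = sum(counts.values())
        h = -sum((c / total) * log2(c / total) for c in counts.values())
        entropies.append(h)
    return sum(entropies) / len(entropies)


# Toy example: a 4-node cycle with two classes.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
labels = {0: "A", 1: "A", 2: "B", 3: "B"}
print(edge_entropy(edges, labels))  # 1.0 bit: edges split evenly across classes
```

In this toy graph each class has half its edge endpoints inside the class and half outside, giving the maximal 1 bit of entropy; a graph whose edges stay entirely within classes would score 0.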