Improving the Expressiveness of K-hop Message-Passing GNNs by Injecting Contextualized Substructure Information

Document Type

Conference Proceeding

Publication Title

Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining

Abstract

Graph neural networks (GNNs) have become the de facto standard for representation learning on graphs and have achieved state-of-the-art performance on many graph-related tasks; however, the expressive power of standard GNNs has been shown to be at most that of the 1-dimensional Weisfeiler-Lehman (1-WL) test. Recently, several lines of work have sought to enhance the expressive power of graph neural networks. One line develops K-hop message-passing GNNs, in which a node's representation is updated by aggregating information not only from its direct neighbors but from all neighbors within K hops of the node. Another line leverages subgraph information and is provably strictly more powerful than the 1-WL test. In this work, we discuss the limitations of K-hop message-passing GNNs and propose a substructure encoding function to uplift the expressive power of any K-hop message-passing GNN. We further inject contextualized substructure information to enhance the expressiveness of K-hop message-passing GNNs. Our method is provably more powerful than previous K-hop graph neural networks and than 1-WL subgraph GNNs (a specific class of subgraph-based GNN models), and it is no less powerful than 3-WL. Empirically, our proposed method sets new state-of-the-art performance or achieves comparable performance on a variety of datasets. Our code is available at https://github.com/tianyao-aka/Expresive-K-hop-GNNs.
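The abstract's description of K-hop message passing, where a node aggregates from all neighbors within K hops rather than only its direct neighbors, can be illustrated concretely. The following is a minimal sketch of one such aggregation step, not the paper's implementation: it assumes an adjacency-list graph, uses a plain sum in place of the learned aggregation a real GNN layer would apply, and the names khop_neighbors and khop_aggregate are hypothetical.

```python
# Illustrative sketch of one K-hop message-passing step (not the paper's code).
from collections import deque

def khop_neighbors(adj, node, k):
    """Return all nodes within k hops of `node` (excluding itself),
    found via breadth-first search over an adjacency-list graph."""
    seen = {node}
    frontier = deque([(node, 0)])
    result = set()
    while frontier:
        u, d = frontier.popleft()
        if d == k:
            continue  # do not expand beyond k hops
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                result.add(v)
                frontier.append((v, d + 1))
    return result

def khop_aggregate(adj, features, k):
    """One K-hop aggregation step: each node's new representation adds the
    summed features of every neighbor within k hops to its own features
    (a stand-in for the learned update of an actual GNN layer)."""
    dim = len(features[0])
    new_features = []
    for node in range(len(adj)):
        neigh = khop_neighbors(adj, node, k)
        agg = [sum(features[v][i] for v in neigh) for i in range(dim)]
        new_features.append([f + a for f, a in zip(features[node], agg)])
    return new_features

# Toy 4-node path graph 0-1-2-3 with scalar node features.
adj = [[1], [0, 2], [1, 3], [2]]
features = [[1.0], [2.0], [3.0], [4.0]]
print(khop_aggregate(adj, features, k=2))  # [[6.0], [10.0], [10.0], [9.0]]
```

With k=1 this reduces to standard message passing; larger k widens each node's receptive field in a single step, which is the setting whose limitations the paper analyzes.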

First Page

3070

Last Page

3081

DOI

10.1145/3580305.3599390

Publication Date

August 4, 2023

Keywords

expressive power of GNNs, graph classification, graph neural networks, graph regression
