Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
2-2023
Abstract
Subgraph isomorphism counting is an important problem on graphs, as many graph-based tasks exploit recurring subgraph patterns. Classical methods usually boil down to a backtracking framework that must navigate a huge search space at prohibitive computational cost. Some recent studies resort to graph neural networks (GNNs) to learn low-dimensional representations of both the query and input graphs, in order to predict the number of subgraph isomorphisms on the input graph. However, typical GNNs employ a node-centric message passing scheme that receives and aggregates messages on nodes, which is inadequate for the complex structure matching required by isomorphism counting. Moreover, on an input graph, the space of possible query graphs is enormous, and different parts of the input graph will be triggered to match different queries. Thus, it is unrealistic to expect a fixed representation of the input graph to match diversely structured query graphs. In this paper, we propose a novel GNN called Count-GNN for subgraph isomorphism counting to address the above challenges. At the edge level, given that an edge is an atomic unit for encoding graph structures, we propose an edge-centric message passing scheme, in which messages on edges are propagated and aggregated based on edge adjacency to preserve fine-grained structural information. At the graph level, we modulate the input graph representation conditioned on the query, so that the input graph can be adapted to each query individually to improve their matching. Finally, we conduct extensive experiments on a number of benchmark datasets to demonstrate the superior performance of Count-GNN.
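The two ideas in the abstract can be illustrated with a minimal sketch: an edge-centric propagation step in which each directed edge aggregates the states of its predecessor edges, and a FiLM-style scale-and-shift modulation of the graph representation conditioned on the query. This is an assumption-laden toy in plain Python, not the authors' actual Count-GNN layers; the function names, the mean aggregation, and the scalar states are all illustrative.

```python
# Hedged sketch, NOT the Count-GNN implementation: scalar edge states and
# mean aggregation stand in for the paper's learned edge-centric updates.
from collections import defaultdict

def edge_centric_layer(edges, h):
    """One propagation step: each directed edge (u, v) aggregates the
    states of its predecessor edges (w, u), i.e. edges adjacent via node u,
    then mixes that aggregate into its own state.
    edges: list of (u, v) pairs; h: dict mapping edge -> float state."""
    by_head = defaultdict(list)          # index edges by their head node
    for (u, v) in edges:
        by_head[v].append((u, v))
    new_h = {}
    for (u, v) in edges:
        preds = by_head[u]               # edges (w, u) feeding into u
        agg = sum(h[e] for e in preds) / len(preds) if preds else 0.0
        new_h[(u, v)] = 0.5 * h[(u, v)] + 0.5 * agg   # simple residual mix
    return new_h

def modulate(graph_repr, query_repr):
    """FiLM-style modulation: scale and shift the input-graph representation
    conditioned on the query (toy linear conditioning, illustrative only)."""
    gamma, beta = 1.0 + query_repr, query_repr
    return gamma * graph_repr + beta

# Tiny directed triangle; all edge states initialized to 1.0.
edges = [(0, 1), (1, 2), (2, 0)]
h = {e: 1.0 for e in edges}
h = edge_centric_layer(edges, h)
```

Because the toy update preserves fine-grained structure at the edge level while the modulation adapts the whole graph representation per query, the sketch mirrors the abstract's two levels of operation, nothing more.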
Keywords
Classical methods, Computational costs, Graph neural networks, Graph-based, Input graphs, Message-passing, Query graph, Search spaces, Subgraph isomorphism, Subgraphs
Discipline
Information Security
Publication
Proceedings of the 37th AAAI Conference on Artificial Intelligence, Washington, USA, February 7-14, 2023
First Page
4845
Last Page
4853
Identifier
10.48550/arXiv.2302.03266
Publisher
AAAI
City or Country
Washington
Citation
YU, Xingtong; LIU, Zemin; FANG, Yuan; and ZHANG, Xinming.
Learning to count isomorphisms with graph neural networks. (2023). Proceedings of the 37th AAAI Conference on Artificial Intelligence, Washington, USA, February 7-14, 2023. 4845-4853.
Available at: https://ink.library.smu.edu.sg/sis_research/8178
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.48550/arXiv.2302.03266