Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

8-2024

Abstract

Graph neural networks have shown widespread success for learning on graphs, but they still face fundamental drawbacks, such as limited expressive power, over-smoothing, and over-squashing. Meanwhile, the transformer architecture offers a potential solution to these issues. However, existing graph transformers primarily cater to homogeneous graphs and are unable to model the intricate semantics of heterogeneous graphs. Moreover, unlike small molecular graphs where the entire graph can be considered as the receptive field in graph transformers, real-world heterogeneous graphs comprise a significantly larger number of nodes and cannot be entirely treated as such. Consequently, existing graph transformers struggle to capture the long-range dependencies in these complex heterogeneous graphs. To address these two limitations, we present Poly-tokenized Heterogeneous Graph Transformer (PHGT), a novel transformer-based heterogeneous graph model. In addition to traditional node tokens, PHGT introduces a novel poly-token design with two more token types: semantic tokens and global tokens. Semantic tokens encapsulate high-order heterogeneous semantic relationships, while global tokens capture semantic-aware long-range interactions. We validate the effectiveness of PHGT through extensive experiments on standardized heterogeneous graph benchmarks, demonstrating significant improvements over state-of-the-art heterogeneous graph representation learning models.
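The poly-token idea described in the abstract can be illustrated with a minimal sketch: node, semantic, and global tokens are concatenated into one sequence and processed by standard self-attention. All dimensions, token counts, and the absence of learned projections are simplifying assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical): 6 node tokens, 3 semantic tokens
# (e.g. one per metapath-induced semantic group), 2 global tokens, hidden size 8.
n_node, n_sem, n_glob, d = 6, 3, 2, 8

node_tok = rng.normal(size=(n_node, d))   # per-node embeddings
sem_tok = rng.normal(size=(n_sem, d))     # high-order semantic summaries
glob_tok = rng.normal(size=(n_glob, d))   # tokens for long-range interactions

# Poly-token sequence: all three token types attend to each other jointly.
tokens = np.concatenate([node_tok, sem_tok, glob_tok], axis=0)

def self_attention(x):
    """Single-head scaled dot-product self-attention (no projections, for brevity)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

out = self_attention(tokens)
node_repr = out[:n_node]  # updated node representations, now informed by
                          # semantic and global context
```

In this sketch every node token can attend to the semantic and global tokens in a single layer, which is how such auxiliary tokens can shorten the path between distant nodes without attending over the full graph.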

Keywords

Data Mining, Mining heterogeneous data, Mining graphs, Graph transformer, Graphical model

Discipline

Artificial Intelligence and Robotics | Graphics and Human Computer Interfaces

Research Areas

Intelligent Systems and Optimization

Areas of Excellence

Digital transformation

Publication

Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI 2024): Jeju, South Korea, August 3-9

First Page

2234

Last Page

2242

Identifier

10.24963/ijcai.2024/247

Publisher

International Joint Conferences on Artificial Intelligence

City or Country

Jeju, South Korea

Comments

PDF provided by faculty

Additional URL

https://doi.org/10.24963/ijcai.2024/247
