Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

4-2024

Abstract

The Graph Transformer (GT) has shown a strong ability to process graph-structured data, addressing limitations of graph neural networks such as over-smoothing and over-squashing. However, applying GT to real-world heterogeneous graphs (HGs) with complex topology still presents numerous challenges. First, it is difficult to design a tokenizer that is compatible with heterogeneity. Second, the computational complexity of the Transformer hampers the acquisition of high-order neighbor information in HGs. In this paper, we propose a novel Hop-based Heterogeneous Graph Transformer (H2Gormer) framework, paving a promising path for HGs to benefit from the capabilities of Transformers. We propose a Heterogeneous Hop-based Token Generation module to obtain high-order information in a flexible way. Specifically, to enrich the fine-grained heterogeneous semantics of each token, we design a tailored multi-relational encoder to encode the hop-based neighbors. The resulting token embeddings are then fed into the Hop-based Transformer to obtain node representations, which are combined with position embeddings to obtain the final encoding. Extensive experiments on four datasets demonstrate the effectiveness of H2Gormer.
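
The abstract describes the data flow only at a high level (hop-based tokens per node, a multi-relational encoder per hop, a Transformer over the hop tokens, and position embeddings combined with the output). The sketch below is a minimal PyTorch illustration of a model with that general shape; it is not the authors' implementation. The module names, the relation-fusion scheme (mean over per-relation projections), the learned hop-index embedding, the use of the hop-0 token as the node representation, and the optional node position embedding are all assumptions made for illustration.

# Minimal sketch of a hop-based heterogeneous graph Transformer (NOT the authors' code).
# Assumes per-hop, per-relation neighbor features have already been aggregated.
import torch
import torch.nn as nn

class HopTokenEncoder(nn.Module):
    """Encodes one hop's neighbors with a separate projection per relation type (assumed fusion: mean)."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.rel_proj = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_relations)])

    def forward(self, hop_feats):
        # hop_feats: list of length num_relations, each tensor [num_nodes, dim]
        # holding relation-wise aggregated neighbor features for this hop.
        stacked = torch.stack([proj(h) for proj, h in zip(self.rel_proj, hop_feats)], dim=0)
        return stacked.mean(dim=0)  # fuse relations -> [num_nodes, dim]

class HopBasedGraphTransformer(nn.Module):
    def __init__(self, dim, num_hops, num_relations, num_heads=4):
        super().__init__()
        self.hop_encoders = nn.ModuleList(
            [HopTokenEncoder(dim, num_relations) for _ in range(num_hops + 1)])
        self.hop_emb = nn.Embedding(num_hops + 1, dim)  # assumed hop-index embedding
        layer = nn.TransformerEncoderLayer(dim, num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, per_hop_feats, pos_emb=None):
        # per_hop_feats: list of length (num_hops + 1); entry k is a list of
        # relation-wise aggregated features of k-hop neighbors, each [num_nodes, dim].
        tokens = torch.stack(
            [enc(f) for enc, f in zip(self.hop_encoders, per_hop_feats)], dim=1
        )                                              # [num_nodes, num_hops + 1, dim]
        tokens = tokens + self.hop_emb.weight.unsqueeze(0)
        z = self.transformer(tokens)                   # attend across the hop tokens
        node_repr = z[:, 0]                            # assumed readout: hop-0 (self) token
        if pos_emb is not None:
            node_repr = node_repr + pos_emb            # combine with node position embeddings
        return node_repr

# Toy usage with random features: 5 nodes, 2 hops plus the hop-0 self features, 3 relation types.
if __name__ == "__main__":
    dim, hops, rels, n = 16, 2, 3, 5
    feats = [[torch.randn(n, dim) for _ in range(rels)] for _ in range(hops + 1)]
    model = HopBasedGraphTransformer(dim, hops, rels)
    out = model(feats, pos_emb=torch.randn(n, dim))
    print(out.shape)  # torch.Size([5, 16])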

Keywords

Graph Neural Networks, Heterogeneous Information Networks, Representation Learning, Graph Embedding, Graph Attention

Discipline

Artificial Intelligence and Robotics | Graphics and Human Computer Interfaces

Research Areas

Intelligent Systems and Optimization

Areas of Excellence

Digital transformation

Publication

WWW '20: The Web Conference 2020, Taipei, Taiwan, April 20-24, 2020

First Page

2346

Last Page

2353

Identifier

10.3233/FAIA240759

Publisher

ACM

City or Country

New York

Additional URL

https://doi.org/10.1145/3366423.3380027
