Publication Type
Journal Article
Version
acceptedVersion
Publication Date
3-2024
Abstract
How to learn discriminative vector representations of graphs has recently attracted increasing interest. To address this problem, we propose Iterative Graph Self-Distillation (IGSD), a method that learns graph-level representations in an unsupervised manner through instance discrimination with a self-supervised contrastive learning approach. IGSD involves a teacher-student distillation process that uses graph diffusion augmentations and constructs the teacher model as an exponential moving average of the student model. The intuition behind IGSD is to train the student network to predict the teacher network's representations of the same graph under different augmented views. As a natural extension, we also apply IGSD to semi-supervised scenarios by jointly regularizing the network with both supervised and self-supervised contrastive losses. Finally, we show that fine-tuning IGSD-trained models with self-training can further improve their graph representation power. Empirically, IGSD achieves significant and consistent performance gains on various graph datasets in both unsupervised and semi-supervised settings, which validates the superiority of IGSD.
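The teacher-student mechanism the abstract describes can be illustrated in code. Below is a minimal PyTorch sketch, not the paper's implementation: the teacher is maintained as an exponential moving average (EMA) of the student, and the student is trained to predict the teacher's representation of the other augmented view. The MLP encoder, decay rate, optimizer, and cosine-similarity loss form are illustrative assumptions; the paper's actual model uses GNN encoders over graph-diffusion-augmented views and its own contrastive objective.

import copy
import torch
import torch.nn.functional as F

# Stand-in encoder: IGSD uses GNN encoders over graph-diffusion-augmented
# views of each graph; a plain MLP on dummy features keeps the sketch runnable.
student = torch.nn.Sequential(
    torch.nn.Linear(32, 16), torch.nn.ReLU(), torch.nn.Linear(16, 16)
)
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher receives no gradients

def ema_update(teacher, student, decay=0.99):
    # teacher <- decay * teacher + (1 - decay) * student, per parameter
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(decay).add_(s, alpha=1.0 - decay)

def predict_teacher(student_out, teacher_out):
    # Cosine-similarity objective (an assumed stand-in for the paper's
    # contrastive loss): the student predicts the teacher's detached
    # representation of the other augmented view.
    s = F.normalize(student_out, dim=-1)
    t = F.normalize(teacher_out.detach(), dim=-1)
    return (2.0 - 2.0 * (s * t).sum(dim=-1)).mean()

# One symmetric training step on two augmented views of a batch of graphs.
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
view_a, view_b = torch.randn(8, 32), torch.randn(8, 32)  # dummy "augmentations"
loss = predict_teacher(student(view_a), teacher(view_b)) \
     + predict_teacher(student(view_b), teacher(view_a))
optimizer.zero_grad()
loss.backward()
optimizer.step()
ema_update(teacher, student)  # teacher tracks the student via EMA

Because only the student receives gradients, the EMA teacher provides slowly evolving targets, which is what makes the iterative self-distillation stable.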
Keywords
graph representation learning, self-supervised learning, contrastive learning
Discipline
Graphics and Human Computer Interfaces
Research Areas
Intelligent Systems and Optimization
Areas of Excellence
Digital Transformation
Publication
IEEE Transactions on Knowledge and Data Engineering
Volume
36
Issue
3
First Page
1161
Last Page
1169
ISSN
1041-4347
Identifier
10.1109/TKDE.2023.3303885
Publisher
Institute of Electrical and Electronics Engineers
Citation
ZHANG, Hanlin; LIN, Shuai; LIU, Weiyang; ZHOU, Pan; TANG, Jian; LIANG, Xiaodan; and XING, Eric.
Iterative graph self-distillation. (2024). IEEE Transactions on Knowledge and Data Engineering. 36, (3), 1161-1169.
Available at: https://ink.library.smu.edu.sg/sis_research/8992
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1109/TKDE.2023.3303885