Simple and Asymmetric Graph Contrastive Learning without Augmentations

Teng Xiao, Huaisheng Zhu, Zhengyu Chen, Suhang Wang

Research output: Contribution to journal › Conference article › peer-review


Abstract

Graph Contrastive Learning (GCL) has shown superior performance in representation learning on graph-structured data. Despite their success, most existing GCL methods rely on prefabricated graph augmentations and homophily assumptions. Thus, they fail to generalize well to heterophilic graphs, where connected nodes may have different class labels and dissimilar features. In this paper, we study the problem of conducting contrastive learning on both homophilic and heterophilic graphs. We find that promising performance can be achieved simply by considering an asymmetric view of the neighboring nodes. The resulting simple algorithm, Asymmetric Contrastive Learning for Graphs (GraphACL), is easy to implement and does not rely on graph augmentations or homophily assumptions. We provide theoretical and empirical evidence that GraphACL can capture one-hop local neighborhood information and two-hop monophily similarity, both of which are important for modeling heterophilic graphs. Experimental results show that the simple GraphACL significantly outperforms state-of-the-art graph contrastive learning and self-supervised learning methods on homophilic and heterophilic graphs. The code of GraphACL is available at https://github.com/tengxiao1/GraphACL.
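For intuition, the sketch below illustrates one way an asymmetric, augmentation-free neighbor-as-positive contrastive objective of the kind described in the abstract could be written in PyTorch. The function name, the predictor head `p`, and the exact loss form are illustrative assumptions, not the authors' implementation; see the linked repository for the actual GraphACL code.

```python
# Hypothetical sketch: asymmetric neighbor-contrastive loss without augmentations.
import torch
import torch.nn.functional as F

def asymmetric_neighbor_contrastive_loss(z, p, edge_index, tau=0.5):
    """z: (N, d) encoder outputs for all nodes.
    p: (N, d) outputs of an extra predictor head applied to z; the asymmetry
       is that only one side of each positive pair passes through this head.
    edge_index: (2, E) edge list; each neighbor serves as a positive view,
       so no graph augmentation is needed.
    """
    src, dst = edge_index                      # edge (v, u): predict neighbor u from node v
    p_norm = F.normalize(p, dim=-1)
    z_norm = F.normalize(z, dim=-1)

    # Cosine similarities between predicted source embeddings and all node
    # embeddings; the true neighbor is the positive, other nodes act as negatives.
    logits = p_norm[src] @ z_norm.t() / tau    # shape (E, N)
    return F.cross_entropy(logits, dst)
```

In such a setup the encoder producing `z` might be a GNN or MLP and the predictor producing `p` a small MLP on top of it; these architectural choices are assumptions for illustration only.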

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 36
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: Dec 10 2023 to Dec 16 2023

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
