TY - GEN
T1 - Simple and Asymmetric Graph Contrastive Learning without Augmentations
AU - Xiao, Teng
AU - Zhu, Huaisheng
AU - Chen, Zhengyu
AU - Wang, Suhang
N1 - Publisher Copyright:
© 2023 Neural information processing systems foundation. All rights reserved.
PY - 2023
Y1 - 2023
N2 - Graph Contrastive Learning (GCL) has shown superior performance in representation learning in graph-structured data. Despite their success, most existing GCL methods rely on prefabricated graph augmentation and homophily assumptions. Thus, they fail to generalize well to heterophilic graphs where connected nodes may have different class labels and dissimilar features. In this paper, we study the problem of conducting contrastive learning on homophilic and heterophilic graphs. We find that we can achieve promising performance simply by considering an asymmetric view of the neighboring nodes. The resulting simple algorithm, Asymmetric Contrastive Learning for Graphs (GraphACL), is easy to implement and does not rely on graph augmentations and homophily assumptions. We provide theoretical and empirical evidence that GraphACL can capture one-hop local neighborhood information and two-hop monophily similarity, which are both important for modeling heterophilic graphs. Experimental results show that the simple GraphACL significantly outperforms state-of-the-art graph contrastive learning and self-supervised learning methods on homophilic and heterophilic graphs. The code of GraphACL is available at https://github.com/tengxiao1/GraphACL.
UR - https://www.scopus.com/pages/publications/85188538653
UR - https://www.scopus.com/inward/citedby.url?scp=85188538653&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85188538653
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
A2 - Oh, A.
A2 - Naumann, T.
A2 - Globerson, A.
A2 - Saenko, K.
A2 - Hardt, M.
A2 - Levine, S.
PB - Neural information processing systems foundation
T2 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -