TY - GEN
T1 - Learning How to Propagate Messages in Graph Neural Networks
AU - Xiao, Teng
AU - Chen, Zhengyu
AU - Wang, Donglin
AU - Wang, Suhang
N1 - Funding Information:
The authors would like to thank the Westlake University and Bright Dream Robotics Joint Institute for the funding support. Suhang Wang is supported by the National Science Foundation under grant numbers IIS-1909702 and IIS-1955851, and by the Army Research Office (ARO) under grant number W911NF-21-1-0198.
Publisher Copyright:
© 2021 ACM.
PY - 2021/8/14
Y1 - 2021/8/14
N2 - This paper studies the problem of learning message propagation strategies for graph neural networks (GNNs). One of the challenges for graph neural networks is that of defining the propagation strategy. For instance, the choices of propagation steps are often specialized to a single graph and are not personalized to different nodes. To compensate for this, in this paper, we present learning to propagate, a general learning framework that not only learns the GNN parameters for prediction but, more importantly, can explicitly learn interpretable and personalized propagation strategies for different nodes and various types of graphs. We introduce the optimal propagation steps as latent variables to help find the maximum-likelihood estimate of the GNN parameters in a variational Expectation-Maximization (VEM) framework. Extensive experiments on various types of graph benchmarks demonstrate that our proposed framework achieves significantly better performance than the state-of-the-art methods, and can effectively learn personalized and interpretable message propagation strategies in GNNs.
AB - This paper studies the problem of learning message propagation strategies for graph neural networks (GNNs). One of the challenges for graph neural networks is that of defining the propagation strategy. For instance, the choices of propagation steps are often specialized to a single graph and are not personalized to different nodes. To compensate for this, in this paper, we present learning to propagate, a general learning framework that not only learns the GNN parameters for prediction but, more importantly, can explicitly learn interpretable and personalized propagation strategies for different nodes and various types of graphs. We introduce the optimal propagation steps as latent variables to help find the maximum-likelihood estimate of the GNN parameters in a variational Expectation-Maximization (VEM) framework. Extensive experiments on various types of graph benchmarks demonstrate that our proposed framework achieves significantly better performance than the state-of-the-art methods, and can effectively learn personalized and interpretable message propagation strategies in GNNs.
UR - http://www.scopus.com/inward/record.url?scp=85114943268&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85114943268&partnerID=8YFLogxK
U2 - 10.1145/3447548.3467451
DO - 10.1145/3447548.3467451
M3 - Conference contribution
AN - SCOPUS:85114943268
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 1894
EP - 1903
BT - KDD 2021 - Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
PB - Association for Computing Machinery
T2 - 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2021
Y2 - 14 August 2021 through 18 August 2021
ER -