TY - GEN
T1 - You Need to Look Globally
T2 - 27th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2023
AU - Zhu, Huaisheng
AU - Tang, Xianfeng
AU - Zhao, Tianxiang
AU - Wang, Suhang
N1 - Funding Information:
This material is based upon work supported by, or in part by, the National Science Foundation (NSF) under grant number IIS-1909702, the Army Research Office (ARO) under grant number W911NF-21-1-0198, and the Department of Homeland Security (DHS) CINA under grant number E205949D. The findings in this paper do not necessarily reflect the views of the funding agencies.
Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - Graph Neural Networks (GNNs) have shown great ability in modeling graph-structured data. However, most current models aggregate information only from a node's local neighborhood, so they may fail to explicitly encode global structure distribution patterns or to efficiently model long-range dependencies in graphs, even though global information is very helpful for learning better representations. In particular, local information propagation becomes less useful when low-degree nodes have limited neighborhoods, or when unlabeled nodes lie so far from labeled nodes that label information cannot be propagated to them. Therefore, we propose a new framework, GSM-GNN, which adaptively combines local and global information to enhance the performance of GNNs. Concretely, it automatically learns representative global topology structures from the graph and stores them in memory cells, which can be plugged into existing GNN models to help propagate global information and augment the representation learning of GNNs. These topology structures are expected to contain both feature and graph structure information, and they can represent important and distinct characteristics of graphs. We conduct experiments on 7 real-world datasets, and the results demonstrate the effectiveness of the proposed framework for node classification.
AB - Graph Neural Networks (GNNs) have shown great ability in modeling graph-structured data. However, most current models aggregate information only from a node's local neighborhood, so they may fail to explicitly encode global structure distribution patterns or to efficiently model long-range dependencies in graphs, even though global information is very helpful for learning better representations. In particular, local information propagation becomes less useful when low-degree nodes have limited neighborhoods, or when unlabeled nodes lie so far from labeled nodes that label information cannot be propagated to them. Therefore, we propose a new framework, GSM-GNN, which adaptively combines local and global information to enhance the performance of GNNs. Concretely, it automatically learns representative global topology structures from the graph and stores them in memory cells, which can be plugged into existing GNN models to help propagate global information and augment the representation learning of GNNs. These topology structures are expected to contain both feature and graph structure information, and they can represent important and distinct characteristics of graphs. We conduct experiments on 7 real-world datasets, and the results demonstrate the effectiveness of the proposed framework for node classification.
UR - http://www.scopus.com/inward/record.url?scp=85163387932&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163387932&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-33377-4_4
DO - 10.1007/978-3-031-33377-4_4
M3 - Conference contribution
AN - SCOPUS:85163387932
SN - 9783031333767
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 40
EP - 52
BT - Advances in Knowledge Discovery and Data Mining - 27th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2023, Proceedings
A2 - Kashima, Hisashi
A2 - Ide, Tsuyoshi
A2 - Peng, Wen-Chih
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 25 May 2023 through 28 May 2023
ER -