TY - GEN
T1 - Learning to Drop: Robust Graph Neural Network via Topological Denoising
T2 - 14th ACM International Conference on Web Search and Data Mining, WSDM 2021
AU - Luo, Dongsheng
AU - Cheng, Wei
AU - Yu, Wenchao
AU - Zong, Bo
AU - Ni, Jingchao
AU - Chen, Haifeng
AU - Zhang, Xiang
N1 - Funding Information:
This project was partially supported by NSF projects IIS-1707548 and CBET-1638320.
Publisher Copyright:
© 2021 ACM.
PY - 2021/3/8
Y1 - 2021/3/8
AB - Graph Neural Networks (GNNs) have been shown to be powerful tools for graph analytics. The key idea is to recursively propagate and aggregate information along the edges of the given graph. Despite their success, however, existing GNNs are usually sensitive to the quality of the input graph. Real-world graphs are often noisy and contain task-irrelevant edges, which may lead to suboptimal generalization performance in the learned GNN models. In this paper, we propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of GNNs by learning to drop task-irrelevant edges. PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks. To take the topology of the entire graph into consideration, nuclear norm regularization is applied to impose a low-rank constraint on the resulting sparsified graph for better generalization. PTDNet can be used as a key component in GNN models to improve their performance on various tasks, such as node classification and link prediction. Experimental studies on both synthetic and benchmark datasets show that PTDNet can improve the performance of GNNs significantly, and the performance gain becomes larger for noisier datasets.
UR - http://www.scopus.com/inward/record.url?scp=85102667185&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85102667185&partnerID=8YFLogxK
U2 - 10.1145/3437963.3441734
DO - 10.1145/3437963.3441734
M3 - Conference contribution
AN - SCOPUS:85102667185
T3 - WSDM 2021 - Proceedings of the 14th ACM International Conference on Web Search and Data Mining
SP - 779
EP - 787
BT - WSDM 2021 - Proceedings of the 14th ACM International Conference on Web Search and Data Mining
PB - Association for Computing Machinery, Inc
Y2 - 8 March 2021 through 12 March 2021
ER -