TY - GEN
T1 - Towards robust graph neural networks for noisy graphs with sparse labels
AU - Dai, Enyan
AU - Jin, Wei
AU - Liu, Hui
AU - Wang, Suhang
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/2/11
Y1 - 2022/2/11
N2 - Graph Neural Networks (GNNs) have shown great ability in modeling graph-structured data. However, real-world graphs usually contain structural noise and have limited labeled nodes. The performance of GNNs drops significantly when they are trained on such graphs, which hinders the adoption of GNNs in many applications. Thus, it is important to develop noise-resistant GNNs that learn well with limited labeled nodes; however, existing work on this problem is rather limited. Therefore, we study the novel problem of developing robust GNNs on noisy graphs with limited labeled nodes. Our analysis shows that both noisy edges and limited labeled nodes can harm the message-passing mechanism of GNNs. To mitigate these issues, we propose a novel framework that adopts the noisy edges as supervision to learn a denoised and dense graph, which can down-weight or eliminate noisy edges and facilitate the message passing of GNNs to alleviate the issue of limited labeled nodes. The generated edges are further used to regularize the predictions of unlabeled nodes with label smoothness to better train GNNs. Experimental results on real-world datasets demonstrate the robustness of the proposed framework on noisy graphs with limited labeled nodes.
AB - Graph Neural Networks (GNNs) have shown great ability in modeling graph-structured data. However, real-world graphs usually contain structural noise and have limited labeled nodes. The performance of GNNs drops significantly when they are trained on such graphs, which hinders the adoption of GNNs in many applications. Thus, it is important to develop noise-resistant GNNs that learn well with limited labeled nodes; however, existing work on this problem is rather limited. Therefore, we study the novel problem of developing robust GNNs on noisy graphs with limited labeled nodes. Our analysis shows that both noisy edges and limited labeled nodes can harm the message-passing mechanism of GNNs. To mitigate these issues, we propose a novel framework that adopts the noisy edges as supervision to learn a denoised and dense graph, which can down-weight or eliminate noisy edges and facilitate the message passing of GNNs to alleviate the issue of limited labeled nodes. The generated edges are further used to regularize the predictions of unlabeled nodes with label smoothness to better train GNNs. Experimental results on real-world datasets demonstrate the robustness of the proposed framework on noisy graphs with limited labeled nodes.
UR - http://www.scopus.com/inward/record.url?scp=85125771228&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85125771228&partnerID=8YFLogxK
U2 - 10.1145/3488560.3498408
DO - 10.1145/3488560.3498408
M3 - Conference contribution
AN - SCOPUS:85125771228
T3 - WSDM 2022 - Proceedings of the 15th ACM International Conference on Web Search and Data Mining
SP - 181
EP - 191
BT - WSDM 2022 - Proceedings of the 15th ACM International Conference on Web Search and Data Mining
PB - Association for Computing Machinery, Inc
T2 - 15th ACM International Conference on Web Search and Data Mining, WSDM 2022
Y2 - 21 February 2022 through 25 February 2022
ER -