To address the relatively low accuracy of methods such as MLP and GCN in node classification on heterophilic graphs, a graph neural network based on similarity random walk aggregation (SRW-GNN) is proposed. To reduce the impact of heterophily on node embeddings, SRW-GNN uses the similarity between nodes as the transition probabilities of random walks, and the sampled paths serve as the neighborhood, enabling the model to gather more homophilous information. Because most existing graph neural network (GNN) aggregators are insensitive to node order, a path aggregator based on recurrent neural networks (RNNs) is introduced to extract both the features and the order information of the nodes on each path. Furthermore, since nodes show different preferences for different paths, an attention mechanism is employed to adaptively learn the importance of each path and dynamically adjust its contribution to the final embedding. Experimental results on several commonly used heterophilic graph datasets show that SRW-GNN achieves significantly higher accuracy than MLP, GCN, H2GCN, HOG-GCN, and other baselines, validating its effectiveness in heterophilic graph node classification.
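To make the pipeline in the abstract concrete, the sketch below illustrates its three components in PyTorch: similarity-weighted random walk sampling, an RNN (here a GRU) path aggregator, and attention over each node's paths. This is a minimal sketch, not the authors' implementation: the use of cosine similarity over raw input features, the GRU variant, the dense 0/1 adjacency matrix with self-loops, and every name and hyperparameter below are assumptions of this illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sample_similarity_walks(x, adj, walks_per_node, walk_len):
    """Sample random walks whose next hop is drawn with probability
    proportional to feature similarity between the current node and its
    neighbours. adj is assumed dense 0/1 with self-loops on the diagonal."""
    n = x.size(0)
    x_unit = F.normalize(x, dim=1)
    sim = F.relu(x_unit @ x_unit.t())                # cosine similarity, clipped at 0
    probs = sim * adj + 1e-10 * adj                  # mask to edges; keep rows samplable
    probs = probs / probs.sum(dim=1, keepdim=True)   # row-stochastic transition matrix
    cur = torch.arange(n).repeat_interleave(walks_per_node)
    hops = [cur]
    for _ in range(walk_len - 1):
        cur = torch.multinomial(probs[cur], 1).squeeze(1)  # next hop ~ similarity
        hops.append(cur)
    return torch.stack(hops, dim=1)                  # (n * walks_per_node, walk_len)


class SRWGNN(nn.Module):
    """Similarity walks -> GRU path encoding -> attention over paths -> logits."""

    def __init__(self, in_dim, hid_dim, n_classes, walks_per_node=4, walk_len=5):
        super().__init__()
        self.walks_per_node, self.walk_len = walks_per_node, walk_len
        self.gru = nn.GRU(in_dim, hid_dim, batch_first=True)  # order-aware aggregator
        self.att = nn.Linear(hid_dim, 1)                      # scores each path
        self.cls = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):
        n = x.size(0)
        paths = sample_similarity_walks(x, adj, self.walks_per_node, self.walk_len)
        _, h = self.gru(x[paths])                    # one final hidden state per sampled path
        h = h.squeeze(0).view(n, self.walks_per_node, -1)
        a = torch.softmax(self.att(h), dim=1)        # attention weights over each node's paths
        z = (a * h).sum(dim=1)                       # weighted combination -> node embedding
        return self.cls(z)


if __name__ == "__main__":
    n, d = 6, 16                                     # toy graph, illustrative only
    adj = (torch.rand(n, n) < 0.4).float()
    adj = ((adj + adj.t()) > 0).float()              # symmetrise
    adj.fill_diagonal_(1.0)                          # self-loops keep every row samplable
    model = SRWGNN(in_dim=d, hid_dim=32, n_classes=2)
    print(model(torch.randn(n, d), adj).shape)       # torch.Size([6, 2])
```

A GRU is chosen here only for concreteness; the abstract specifies an RNN-based path aggregator, and the cited LSTM and GRU references suggest either gated variant would fit.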
[1] Xu Bing-bing, Cen Ke-yan, Huang Jun-jie, et al. A survey on graph convolutional neural network[J]. Chinese Journal of Computers, 2020, 43(5): 755-780.
[2] LeCun Y, Bottou L, Bengio Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998, 86(11): 2278-2324.
[3] Gilmer J, Schoenholz S S, Riley P F, et al. Neural message passing for quantum chemistry[C]∥Proceedings of the International Conference on Machine Learning, Sydney, Australia, 2017: 1263-1272.
[4] Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks[J/OL]. [2023-08-11]. https://arxiv.org/abs/1609.02907.
[5] Hamilton W L, Ying R, Leskovec J. Inductive representation learning on large graphs[J/OL]. [2023-08-11]. https://arxiv.org/abs/1706.02216.
[6] Zhu J, Rossi R A, Rao A, et al. Graph neural networks with heterophily[C]∥Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(12): 11168-11176.
[7] Wang T, Jin D, Wang R, et al. Powerful graph convolutional networks with adaptive propagation mechanism for homophily and heterophily[C]∥Proceedings of the AAAI Conference on Artificial Intelligence, Philadelphia, USA, 2022, 36(4): 4210-4218.
[8] He D, Liang C, Liu H, et al. Block modeling-guided graph convolutional neural networks[C]∥Proceedings of the AAAI Conference on Artificial Intelligence, Philadelphia, USA, 2022: 4022-4029.
[9] Yan Y, Hashemi M, Swersky K, et al. Two sides of the same coin: heterophily and oversmoothing in graph convolutional neural networks[C]∥2022 IEEE International Conference on Data Mining (ICDM), Chennai, India, 2022: 1287-1292.
[10] Zhu M, Wang X, Shi C, et al. Interpreting and unifying graph neural networks with an optimization framework[C]∥Proceedings of the Web Conference, New York, NY, USA, 2021: 1215-1226.
[11] Pei H, Wei B, Chang K C C, et al. Geom-GCN: geometric graph convolutional networks[J/OL]. [2023-08-11]. https://arxiv.org/abs/2002.05287.
[12] Zhu J, Yan Y, Zhao L, et al. Beyond homophily in graph neural networks: current limitations and effective designs[C]∥Advances in Neural Information Processing Systems, Long Beach, USA, 2020, 33: 7793-7804.
[13] Jin D, Wang R, Ge M, et al. RAW-GNN: random walk aggregation based graph neural network[C]∥Proceedings of the 31st International Joint Conference on Artificial Intelligence, Shenzhen, China, 2022: 2108-2114.
[14] Fu X Y, Zhang J N, Meng Z Q, et al. MAGNN: metapath aggregated graph neural network for heterogeneous graph embedding[C]∥Proceedings of the Web Conference, Taipei, China, 2020: 2331-2341.
[15] Luan S, Hua C, Lu Q, et al. Revisiting heterophily for graph neural networks[J]. Advances in Neural Information Processing Systems, 2022, 35: 1362-1375.
[16] Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[17] Chung J, Gulcehre C, Cho K H, et al. Empirical evaluation of gated recurrent neural networks on sequence modeling[J/OL]. [2023-08-11]. https://arxiv.org/abs/1412.3555.