CHEN Linkai, MAO Guojun. An adaptive frequency and dynamic node embedding based graph convolutional network[J]. Journal of Fujian University of Technology, 2023, 21(01): 78-83. [doi:10.3969/j.issn.1672-4348.2023.01.012]

An adaptive frequency and dynamic node embedding based graph convolutional network

Journal of Fujian University of Technology [ISSN:2097-3853/CN:35-1351/Z]

Volume:
Vol. 21
Issue:
2023, No. 01
Pages:
78-83
Publication date:
2023-02-25

Article Info

Title:
An adaptive frequency and dynamic node embedding based graph convolutional network
Author(s):
CHEN Linkai, MAO Guojun
School of Computer Science and Mathematics, Fujian University of Technology
Keywords:
graph neural networks; graph convolutional neural networks; over-smoothing; node classification; frequency adaptation
CLC number:
TP18
DOI:
10.3969/j.issn.1672-4348.2023.01.012
Document code:
A
Abstract:
Graph convolutional networks have been studied extensively because they can process graph-structured data directly. Most current graph convolutional networks, however, rely on the smoothness of the graph signal (low-frequency information) and cannot generate node embeddings matched to the receptive field that suits each node; as the number of layers grows, the over-smoothing problem peculiar to graph convolutional networks tends to arise and degrades performance. To address this, an adaptive frequency and dynamic node embedding based graph convolutional network (FDGCN) is proposed. The FDGCN model adaptively aggregates information at different frequencies and, using the output of every network layer, balances the global and local neighborhood information of each node to adjust its embedding dynamically. Experiments on four public datasets against six existing models demonstrate the effectiveness of FDGCN.
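To make the two mechanisms in the abstract concrete, below is a minimal PyTorch sketch (dense adjacency, small graphs only): a learned per-edge gate in [-1, 1] chooses between low-frequency aggregation (positive values, neighbor smoothing) and high-frequency aggregation (negative values, neighbor differences), and node-wise sigmoid scores mix the outputs of all layers so each node balances local and global information. The tanh gate, the sigmoid retention scores, and every name and dimension here are illustrative assumptions, not the authors' released implementation.

# Illustrative sketch of the two ideas in the FDGCN abstract; this is an
# assumption-laden reconstruction, not the paper's code.
import torch
import torch.nn as nn


class FreqAdaptiveLayer(nn.Module):
    """One propagation step with a learned per-edge frequency gate."""

    def __init__(self, dim: int, eps: float = 0.3):
        super().__init__()
        self.gate = nn.Linear(2 * dim, 1)  # scores each (i, j) node pair
        self.eps = eps                     # residual weight on the node itself

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        n = h.size(0)
        # Gate alpha_ij in [-1, 1]: >0 acts as a low-pass (smoothing) filter,
        # <0 as a high-pass (difference) filter. Dense O(n^2) for simplicity.
        hi = h.unsqueeze(1).expand(n, n, -1)
        hj = h.unsqueeze(0).expand(n, n, -1)
        alpha = torch.tanh(self.gate(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        deg = adj.sum(dim=1).clamp(min=1.0)
        norm = (deg.unsqueeze(1) * deg.unsqueeze(0)).rsqrt()  # 1/sqrt(d_i d_j)
        coef = alpha * norm * adj  # keep only existing edges
        return self.eps * h + coef @ h


class FDGCNSketch(nn.Module):
    """Stacks gated layers, then mixes all layer outputs per node."""

    def __init__(self, in_dim: int, hid: int, n_cls: int, n_layers: int = 4):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid)
        self.layers = nn.ModuleList(FreqAdaptiveLayer(hid) for _ in range(n_layers))
        self.score = nn.Linear(hid, 1)  # node-wise retention score per layer
        self.out = nn.Linear(hid, n_cls)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.proj(x))
        states = [h]                       # keep every layer's output
        for layer in self.layers:
            h = layer(h, adj)
            states.append(h)
        stack = torch.stack(states, dim=1)       # (n, L+1, hid)
        s = torch.sigmoid(self.score(stack))     # (n, L+1, 1)
        h_final = (s * stack).sum(dim=1)         # balance local vs. global info
        return self.out(h_final)


if __name__ == "__main__":
    x = torch.randn(5, 8)                  # 5 nodes, 8 input features
    adj = (torch.rand(5, 5) > 0.5).float()
    adj = ((adj + adj.T) > 0).float()      # symmetric toy adjacency
    model = FDGCNSketch(in_dim=8, hid=16, n_cls=3)
    print(model(x, adj).shape)             # torch.Size([5, 3])

Because the retention scores are computed per node, shallow nodes can keep mostly local (early-layer) states while others draw on deeper, more global ones, which is one plausible reading of how dynamic node embedding could counter over-smoothing.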



Last Update: 2023-02-25