
Journal of Shandong University (Natural Science) ›› 2025, Vol. 60 ›› Issue (7): 84-93. doi: 10.6040/j.issn.1671-9352.4.2024.839



  • Corresponding author: XU Ji (1979— ), male, professor, master's supervisor, Ph.D.; his research interests include data mining, granular computing, and machine learning. E-mail: jixu@gzu.edu.cn
  • First author: WU Xinyao (2001— ), male, master's student; his research interest is graph neural networks. E-mail: gs.xinyaowu22@gzu.edu.cn
  • Funding: National Natural Science Foundation of China (62366008, 61966005)

Hierarchical graph representation learning based on graphical mutual information pooling

WU Xinyao1,2, XU Ji1*   

  1. State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, Guizhou, China;
    2. School of Computer Science and Technology, Guizhou University, Guiyang 550025, Guizhou, China
  • Published: 2025-07-01



Abstract: A graph pooling operator based on graphical mutual information, called graphical mutual information pooling (GMIPool), is proposed. GMIPool uses mutual information neural estimation to measure the graphical mutual information between each node and its corresponding support graph, covering both feature mutual information and structural mutual information, and leverages this information to identify and retain the key nodes of the graph, constructing a more compact coarsened graph. To ensure structural consistency between the original graph and the coarsened graph, the method corrects the structure of the coarsened graph using the neighborhood correlations between nodes. Experiments on several node classification datasets validate the effectiveness of GMIPool.
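The coarsening procedure the abstract describes — score nodes, keep the most informative ones, then repair the connectivity of the coarsened graph — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `scores` array stands in for the per-node graphical mutual information that GMIPool would obtain from a MINE-style estimator, and the two-hop reconnection is one common way to restore connectivity after top-k pooling, assumed here purely for illustration.

```python
import numpy as np

def topk_pool(adj, scores, ratio=0.5):
    """Sketch of top-k graph pooling with a simple structure correction.

    adj    : (n, n) adjacency matrix of the original graph
    scores : (n,) per-node importance; in GMIPool this would be the
             estimated graphical mutual information (hypothetical here)
    ratio  : fraction of nodes to retain in the coarsened graph
    """
    n = adj.shape[0]
    k = max(1, int(np.ceil(ratio * n)))
    idx = np.argsort(-scores)[:k]           # indices of the retained nodes
    coarse = adj[np.ix_(idx, idx)]          # induced subgraph adjacency
    # Structure correction (an assumed, generic remedy): also connect
    # retained nodes that share a neighbor in the original graph, so the
    # coarsened graph does not fall apart when intermediate nodes are dropped.
    two_hop = (adj @ adj)[np.ix_(idx, idx)] > 0
    coarse = ((coarse > 0) | two_hop).astype(float)
    np.fill_diagonal(coarse, 0.0)           # no self-loops
    return idx, coarse

# Toy 4-node path graph 0-1-2-3 with hypothetical MI estimates.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
scores = np.array([0.9, 0.1, 0.8, 0.2])
idx, coarse = topk_pool(adj, scores, ratio=0.5)
print(idx)      # → [0 2]  (nodes 0 and 2 are kept)
print(coarse)   # nodes 0 and 2 get reconnected via their shared neighbor 1
```

Without the two-hop correction the coarsened graph here would have no edges at all, since nodes 0 and 2 are not directly adjacent in the original path graph; the correction step is what keeps the coarse graph structurally consistent with the original.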

Key words: graph neural networks, graph pooling, multi-granularity, graphical mutual information, mutual information neural estimation

CLC number: TP391