JOURNAL OF SHANDONG UNIVERSITY (NATURAL SCIENCE) ›› 2025, Vol. 60 ›› Issue (7): 69-83. doi: 10.6040/j.issn.1671-9352.7.2024.452


Multi-label feature selection with label manifold and dynamic graph constraints

WU Xiaojun1, CHEN Yidan2, HAO Yaojun1, SONG Changwei3, HE Deqing4   

  1. Xinzhou Normal University, Xinzhou 034000, Shanxi, China;
    2. Henan Open University, Zhengzhou 450046, Henan, China;
    3. College of Information and Management Science, Henan Agricultural University, Zhengzhou 450002, Henan, China;
    4. School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, Hubei, China
  • Published: 2025-07-01

Abstract: A multi-label feature selection algorithm with label manifold and dynamic graph constraints is proposed by integrating an adaptive dynamic graph technique and a label manifold into an improved linear mapping learning framework. In this algorithm, an improved matrix decomposition technique based on feature self-representation refines the linear mapping model and decouples the correlations between features and labels as well as between different labels. An adaptive dynamic graph technique with a Laplacian rank constraint is designed to learn a high-quality feature similarity graph, and a label manifold based on label relevance is constructed so that label information is fully incorporated into training. Extensive experimental results verify that the adaptive dynamic graph technique effectively improves the quality of the graph matrix and that the proposed algorithm is effective for the multi-label feature selection problem.
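The adaptive dynamic graph step is in the spirit of constrained-Laplacian-rank graph learning (cf. refs. [34] and [36]): the similarity graph is re-estimated at every iteration while the rank of its Laplacian is driven toward n - c through the Ky Fan theorem, so that the learned graph tends toward exactly c connected components. The Python sketch below illustrates only this generic procedure under that assumption; the function and parameter names (adaptive_graph, simplex_projection, gamma, lam) are illustrative, and the paper's actual update rules, which additionally involve feature self-representation and the label manifold, are not reproduced here.

import numpy as np

def simplex_projection(v):
    # Euclidean projection of v onto the probability simplex {s : s >= 0, sum(s) = 1}.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1.0) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

def adaptive_graph(X, n_components=5, gamma=1.0, lam=1.0, n_iter=20):
    # Learn a nonnegative, row-stochastic similarity graph S whose Laplacian rank is
    # pushed toward n - n_components (CLR/CAN-style sketch, not the paper's exact rules).
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    dist = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)  # squared distances
    np.fill_diagonal(dist, 1e12)                                       # forbid self-loops

    # initial graph: each row solves min_s d_i^T s + gamma * ||s||^2 on the simplex
    S = np.vstack([simplex_projection(-dist[i] / (2.0 * gamma)) for i in range(n)])

    for _ in range(n_iter):
        # spectral embedding: bottom-c eigenvectors of the symmetrized Laplacian (Ky Fan theorem)
        A = (S + S.T) / 2.0
        L = np.diag(A.sum(axis=1)) - A
        _, vecs = np.linalg.eigh(L)
        F = vecs[:, :n_components]

        # distances in the embedding penalize edges that cross component boundaries
        sqf = np.sum(F ** 2, axis=1)
        distf = np.maximum(sqf[:, None] + sqf[None, :] - 2.0 * F @ F.T, 0.0)

        # row-wise closed-form update of S on the simplex
        D = dist + lam * distf
        S = np.vstack([simplex_projection(-D[i] / (2.0 * gamma)) for i in range(n)])

    return (S + S.T) / 2.0

Because the abstract learns a feature similarity graph, the same routine would be applied to the transposed data matrix so that the objects being connected are features rather than samples; in CLR-style methods the weight lam is also typically adjusted during the iterations according to how many connected components the current graph has.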

Key words: multi-label learning, feature selection, manifold learning, adaptive learning, dynamic graph learning

CLC Number: TP181

[1] WANG Boyan, HU Xuegang, LI Peipei, et al. Cognitive structure learning model for hierarchical multi-label text classification[J]. Knowledge-Based Systems, 2021, 218:106876.
[2] XIONG Jie, YU Li, NIU Xi, et al. XRR: extreme multi-label text classification with candidate retrieving and deep ranking[J]. Information Sciences, 2023, 622:115-132.
[3] KOMEILI M, LOUIS W, ARMANFARD N, et al. Feature selection for nonstationary data: application to human recognition using medical biometrics[J]. IEEE Transactions on Cybernetics, 2017, 48(5):1446-1459.
[4] JANET J P, KULIK H J. Resolving transition metal chemical space: feature selection for machine learning and structure-property relationships[J]. The Journal of Physical Chemistry A, 2017, 121(46):8939-8954.
[5] BERMINGHAM M L, PONG-WONG R, SPILIOPOULOU A, et al. Application of high-dimensional feature selection: evaluation for genomic prediction in man[J]. Scientific Reports, 2015, 5:10312.
[6] HASTIE T, TIBSHIRANI R, FRIEDMAN J. The elements of statistical learning: data mining, inference, and prediction[J]. The Mathematical Intelligencer, 2005, 27(2):83-85.
[7] SUN Xin, LIU Yanheng, LI Jin, et al. Using cooperative game theory to optimize the feature selection problem[J]. Neurocomputing, 2012, 97:86-93.
[8] KASHEF S, NEZAMABADI-POUR H, NIKPOUR B. Multilabel feature selection: a comprehensive review and guiding experiments[J]. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2018, 8(2):e1240.
[9] YIN Hui, YANG Shuiqiao, SONG Xiangyu, et al. Deep fusion of multimodal features for social media retweet time prediction[J]. World Wide Web, 2021, 24:1027-1044.
[10] SHI Jianbo, MALIK J. Normalized cuts and image segmentation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(8):888-905.
[11] ROWEIS S T, SAUL L K. Nonlinear dimensionality reduction by locally linear embedding[J]. Science, 2000, 290(5500):2323-2326.
[12] ZHANG Yao, MA Yingcang, YANG Xiaofei. Multi-label feature selection based on logistic regression and manifold learning[J]. Applied Intelligence, 2022, 52(8):9256-9273.
[13] ZHANG Yao, MA Yingcang. Sparse multi-label feature selection via dynamic graph manifold regularization[J]. International Journal of Machine Learning and Cybernetics, 2023, 14(3):1021-1036.
[14] HU Juncheng, LI Yonghao, XU Gaochao, et al. Dynamic subspace dual-graph regularized multi-label feature selection[J]. Neurocomputing, 2022, 467:184-196.
[15] LI Yonghao, HU Liang, ZHANG Ping, et al. Multi-label feature selection based on dynamic graph Laplacian[J]. Journal on Communications, 2020, 41(12):47-59.
[16] ZHANG Yao, MA Yingcang. Non-negative multi-label feature selection with dynamic graph constraints[J]. Knowledge-Based Systems, 2022, 238:107924.
[17] LI Yonghao, HU Liang, GAO Wanfu. Robust sparse and low-redundancy multi-label feature selection with dynamic local and global structure preservation[J]. Pattern Recognition, 2023, 134:109120.
[18] ZHANG Minling, ZHOU Zhihua. A review on multi-label learning algorithms[J]. IEEE Transactions on Knowledge and Data Engineering, 2013, 26(8):1819-1837.
[19] BOUTELL M R, LUO Jiebo, SHEN Xipeng, et al. Learning multi-label scene classification[J]. Pattern Recognition, 2004, 37(9):1757-1771.
[20] HUANG Jun, LI Guorong, HUANG Qingming, et al. Learning label specific features for multi-label classification[C] // Proceedings of the 2015 IEEE International Conference on Data Mining. Atlantic City: IEEE, 2015:181-190.
[21] ZHANG Jia, LUO Zhiming, LI Candong, et al. Manifold regularized discriminative feature selection for multi-label learning[J]. Pattern Recognition, 2019, 95:136-150.
[22] HU Juncheng, LI Yonghao, GAO Wanfu, et al. Robust multi-label feature selection with dual-graph regularization[J]. Knowledge-Based Systems, 2020, 203:106126.
[23] LI Yonghao, HU Liang, GAO Wanfu. Multi-label feature selection via robust flexible sparse regularization[J]. Pattern Recognition, 2023, 134:109074.
[24] CAI Zhiling, ZHU William. Multi-label feature selection via feature manifold learning and sparsity regularization[J]. International Journal of Machine Learning and Cybernetics, 2018, 9:1321-1334.
[25] GAO Wanfu, LI Yonghao, HU Liang. Multilabel feature selection with constrained latent structure shared term[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(3):1253-1262.
[26] LI Yonghao, HU Liang, GAO Wanfu. Label correlations variation for robust multi-label feature selection[J]. Information Sciences, 2022, 609:1075-1097.
[27] HAN Jiuqi, SUN Zhengya, HAO Hongwei. Selecting feature subset with sparsity and low redundancy for unsupervised learning[J]. Knowledge-Based Systems, 2015, 86:210-223.
[28] ZHANG Yao, HUO Wei, TANG Jun. Multi-label feature selection via latent representation learning and dynamic graph constraints[J]. Pattern Recognition, 2024, 151:110411.
[29] LIN Yaojin, HU Qinghua, LIU Jinghua, et al. Multi-label feature selection based on max-dependency and min-redundancy[J]. Neurocomputing, 2015, 168:92-103.
[30] LEE J, KIM D W. SCLS: multi-label feature selection based on scalable criterion for large label set[J]. Pattern Recognition, 2017, 66:342-352.
[31] HASHEMI A, DOWLATSHAHI M B, NEZAMABADI-POUR H. MFS-MCDM: multi-label feature selection using multi-criteria decision making[J]. Knowledge-Based Systems, 2020, 206:106365.
[32] ZOU Yizhang, HU Xuegang, LI Peipei. Gradient-based multi-label feature selection considering three-way variable interaction[J]. Pattern Recognition, 2024, 145:109900.
[33] XU Wei, GONG Yihong. Document clustering by concept factorization[C] // Proceedings of the 27th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. Sheffield: ACM, 2004:202-209.
[34] FAN K. On a theorem of Weyl concerning eigenvalues of linear transformations I [J]. Proceedings of the National Academy of Sciences, 1949, 35(11):652-655.
[35] LEE D D, SEUNG H S. Learning the parts of objects by non-negative matrix factorization[J]. Nature, 1999, 401(6755):788-791.
[36] HUANG Jin, NIE Feiping, HUANG Heng. A new simplex sparse learning model to measure data similarity for clustering[C] // Proceedings of the 24th International Joint Conference on Artificial Intelligence. Burlington: Morgan Kaufmann, 2015:3569-3575.
[37] LEE D D, SEUNG H S. Algorithms for non-negative matrix factorization[J]. Advances in Neural Information Processing Systems, 2001, 13:556-562.
[38] DING C H Q, LI Tao, JORDAN M I. Convex and semi-nonnegative matrix factorizations[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008, 32(1):45-55.
[39] ZHANG Minling, ZHOU Zhihua. ML-KNN: a lazy learning approach to multi-label learning[J]. Pattern Recognition, 2007, 40(7):2038-2048.
[40] DOUGHERTY J, KOHAVI R, SAHAMI M. Supervised and unsupervised discretization of continuous features[M]. Massachusetts: Morgan Kaufmann, 1995.
[41] XUE Guotong, ZHONG Ming, LI Jianxin, et al. Dynamic network embedding survey[J]. Neurocomputing, 2022, 472:212-223.
[42] DUNN O J. Multiple comparisons among means[J]. Journal of the American Statistical Association, 1961, 56(293):52-64.
[43] FRIEDMAN M. A comparison of alternative tests of significance for the problem of m rankings[J]. The Annals of Mathematical Statistics, 1940, 11(1):86-92.