
Journal of Shandong University (Natural Science) ›› 2022, Vol. 57 ›› Issue (11): 89-101. doi: 10.6040/j.issn.1671-9352.4.2021.199


Image classification based on low-rank inter-class sparsity discriminant least squares regression

ZHONG Kun-yan, LIU Jing-lei*   

  1. School of Computer and Control Engineering, Yantai University, Yantai 264005, Shandong, China
  • Published: 2022-11-10
  • About the authors: ZHONG Kun-yan (1996– ), male, master's student; research interest: matrix regression. E-mail: killove9@163.com. *Corresponding author: LIU Jing-lei (1970– ), PhD, professor; research interests: artificial intelligence and theoretical computer science. E-mail: jinglei_liu@sina.com
  • Supported by:
    National Natural Science Foundation of China (61572419, 62072391); Natural Science Foundation of Shandong Province (ZR2020MF148)

Abstract: Classifiers based on least squares regression (LSR) are effective in multi-class tasks, but most existing methods use a limited number of projections and therefore lose much discriminative information, while some algorithms focus only on fitting the samples to the target matrix exactly and ignore overfitting. To address these problems and improve classification performance, this paper proposes a multi-class image classification method based on low-rank inter-class sparsity discriminative least squares regression (LRICSDLSR). First, an inter-class sparsity constraint is introduced into the discriminative least squares regression model, so that the margin between samples of the same class is greatly reduced while the margin between samples of different classes is enlarged. Second, a low-rank constraint is imposed on the relaxed labels obtained from the non-negative relaxation matrix to improve their intra-class compactness and similarity. Third, an additional regularization term is introduced on the learned labels to avoid overfitting. Experimental results on a series of image data sets show that these three improvements help learn more discriminative regression projections and thus achieve better classification performance.

Key words: low-rank, inter-class sparsity, image classification, regression, projection
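To make the baseline concrete: the abstract builds on the standard LSR classifier, which learns a projection mapping samples to one-hot label targets and regularizes it to curb overfitting. The sketch below is a minimal illustration of that baseline only, not the paper's full LRICSDLSR model (the inter-class sparsity and low-rank label constraints are omitted); all names and the toy data are the author's own for illustration.

```python
import numpy as np

def fit_lsr(X, y, n_classes, lam=1e-2):
    """Ridge-regularized least squares regression classifier.

    X: (n_samples, n_features) data matrix; y: integer labels in [0, n_classes).
    The l2 penalty lam plays the same anti-overfitting role as the extra
    regularization term described in the abstract.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    Y = np.eye(n_classes)[y]                        # one-hot target matrix
    # Closed-form solution: W = (Xb^T Xb + lam*I)^{-1} Xb^T Y
    W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ Y)
    return W

def predict_lsr(X, W):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.argmax(Xb @ W, axis=1)                # closest target column wins

# Toy usage on two well-separated classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(2, 0.1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
W = fit_lsr(X, y, n_classes=2)
print((predict_lsr(X, W) == y).mean())              # training accuracy
```

The strict one-hot target is exactly the rigid fitting the paper relaxes: LRICSDLSR instead learns a non-negative relaxation of the label matrix, constrained to be low-rank and inter-class sparse.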

CLC number: TP18
[1] ZHANG Zheng, LAI Zhihui, XU Yong, et al. Discriminative elastic-net regularized linear regression[J]. IEEE Transactions on Image Processing, 2017, 26(3):1466-1481.
[2] CHEN Zhe, WU Xiaojun, KITTLER J. A sparse regularized nuclear norm based matrix regression for face recognition with contiguous occlusion[J]. Pattern Recognition Letters, 2019, 125:494-499.
[3] YUAN H L, LI J Y, LAI L L, et al. Low-rank matrix regression for image feature extraction and feature selection[J]. Information Sciences, 2020, 522:214-226.
[4] RUPPERT D, WAND M P. Multivariate locally weighted least squares regression[J]. The Annals of Statistics, 1994, 22(3):1346-1370.
[5] RUPPERT D, SHEATHER S J, WAND M P. An effective bandwidth selector for local least squares regression[J]. Journal of the American Statistical Association, 1995, 90(432):1257-1270.
[6] TIBSHIRANI R. Regression shrinkage and selection via the LASSO: a retrospective[J]. Journal of the Royal Statistical Society: Series B(Statistical Methodology), 2011, 73(3):267-288.
[7] AN S J, LIU W Q, VENKATESH S. Face recognition using kernel ridge regression[C] //IEEE Conference on Computer Vision & Pattern Recognition. Piscataway: IEEE, 2007.
[8] AVRON H, CLARKSON K, WOODRUFF D. Faster kernel ridge regression using sketching and preconditioning[J]. SIAM Journal on Matrix Analysis and Applications, 2017, 38(4):1116-1138.
[9] STRUTZ T. Data fitting and uncertainty(a practical introduction to weighted least squares and beyond)[M]. Wiesbaden: Vieweg and Teubner, 2010.
[10] JIAO Licheng, BO Liefeng, WANG Ling. Fast sparse approximation for least squares support vector machine[J]. IEEE Transactions on Neural Networks, 2007, 18(3):685-697.
[11] ABDI H. Partial least squares regression and projection on latent structure regression(PLS Regression)[J]. Wiley Interdisciplinary Reviews: Computational Statistics, 2010, 2(1):97-106.
[12] XIANG Shiming, NIE Feiping, MENG Gaofei. Discriminative least squares regression for multiclass classification and feature selection[J]. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(11):1738-1754.
[13] ZHANG Xuyao, WANG Linfeng, XIANG Shiming, et al. Retargeted least squares regression algorithm[J]. IEEE Transactions on Neural Networks and Learning Systems, 2015, 26(9):2206-2213.
[14] WANG Linfeng, PAN Chuhong. Groupwise retargeted least squares regression[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(4):1352-1358.
[15] FANG Xiaozhao, YONG Xu, LI Xuelong, et al. Regularized label relaxation linear regression[J]. IEEE Transactions on Neural Networks & Learning Systems, 2018, 29(4):1006-1018.
[16] FANG Xiaozhao, TENG Shaohua, LAI Zhihui, et al. Robust latent subspace learning for image classification[J]. IEEE Transactions on Neural Networks and Learning Systems, 2017, 29(6):2502-2515.
[17] WEN Jie, XU Yong, LI Zuoyong, et al. Inter-class sparsity based discriminative least square regression[J]. Neural Networks, 2018, 102(1):36-47.
[18] BOYD S, PARIKH N, CHU E, et al. Distributed optimization and statistical learning via the alternating direction method of multipliers[J]. Foundations and Trends in Machine Learning, 2011, 3(1):1-122.
[19] CAI J F, CANDES E J, SHEN Z W. A singular value thresholding algorithm for matrix completion[J]. SIAM Journal on Optimization, 2010, 20(4):1956-1982.
[20] YUAN Aihong, YOU Mengbo, HE Dongjian. Convex non-negative matrix factorization with adaptive graph for unsupervised feature selection[J]. IEEE Transactions on Cybernetics, 2020, 1(99):1-13.
[21] FANG Xiaozhao, HAN Na, WU Jikang, et al. Approximate low-rank projection learning for feature extraction[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(11):5228-5241.
[22] JIANG Z L, LIN Z, DAVIS L S. Learning a discriminative dictionary for sparse coding via label consistent K-SVD[C] //IEEE Conference on Computer Vision & Pattern Recognition. Piscataway:IEEE, 2011.
[23] LEARNED-MILLER E, HUANG G B, ROYCHOWDHURY A, et al. Labeled faces in the wild: a survey[M] //Advances in Face Detection and Facial Image Analysis. Berlin: Springer International Publishing, 2016.
[24] BOUREAU Y L, BACH F, LECUN Y, et al. Learning mid-level features for recognition[C] //Proc. International Conference on Computer Vision and Pattern Recognition(CVPR'10). Piscataway: IEEE, 2010.
[25] LI Hui, WU Xiaojun. DenseFuse: a fusion approach to infrared and visible images[J]. IEEE Transactions on Image Processing, 2019, 28(5):2614-2623.
[26] XU Tianyang, FENG Zhenhua, WU Xiaojun, et al. Joint group feature selection and discriminative filter learning for robust visual object tracking[J/OL]. arXiv, 2019. https://arxiv.org/abs/1907.13242.
[27] XU Tianyang, FENG Zhenhua, WU Xiaojun. Learning adaptive discriminative correlation filters via temporal consistency preserving spatial feature selection for robust visual object tracking[J]. IEEE Transactions on Image Processing, 2019, 28(11):5596-5609.
[28] HOU Chenping, JIAO Yuanyuan, NIE Feiping. 2D feature selection by sparse matrix regression[J]. IEEE Transactions on Image Processing, 2017, 26(9):4255-4268.
[29] DENG J, DONG W, SOCHER R, et al. ImageNet: a large-scale hierarchical image database[C] //Proc. International Conference on Computer Vision and Pattern Recognition(CVPR). Florida: IEEE, 2009.
[30] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[J/OL]. arXiv, 2014. https://arxiv.org/abs/1409.1556.
[31] HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C] //Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition(CVPR). Piscataway: IEEE, 2016.