JOURNAL OF SHANDONG UNIVERSITY (NATURAL SCIENCE) ›› 2022, Vol. 57 ›› Issue (11): 89-101. doi: 10.6040/j.issn.1671-9352.4.2021.199


Image classification based on low-rank inter-class sparsity discriminant least squares regression

ZHONG Kun-yan, LIU Jing-lei*   

School of Computer and Control Engineering, Yantai University, Yantai 264005, Shandong, China
Published: 2022-11-10

Abstract: Classifiers based on least squares regression (LSR) are effective in multi-classification tasks. However, most existing methods use limited projections and therefore discard a large amount of discriminative information, and some algorithms focus only on accurately fitting the samples to the target matrix while ignoring the problem of overfitting. To address these issues and improve classification performance, this paper proposes a multi-class image classification method based on low-rank inter-class sparsity discriminative least squares regression (LRICSDLSR). First, an inter-class sparsity constraint is introduced into the discriminative least squares regression model, so that the margin between samples of the same class is greatly reduced while the margin between samples of different classes is enlarged. Second, a low-rank constraint is applied to the relaxed labels obtained from the non-negative relaxation matrix, improving their intra-class compactness and similarity. Third, an additional regularization term is imposed on the learned labels to avoid overfitting. These constraints help to learn more discriminative regression projections and thereby achieve better classification performance; experimental results on a series of image data sets demonstrate the effectiveness of the proposed method.
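
To make the three constraints named in the abstract concrete, the following is a minimal NumPy sketch of the corresponding objective terms. The variable names (X, Y, labels, W, b, M) and the weights lam1-lam3 are illustrative assumptions rather than the paper's notation, and the exact LRICSDLSR formulation and solver are those given in the paper, not this sketch.

```python
# A minimal NumPy sketch of the objective terms described in the abstract.
# The variable names (X, Y, labels, W, b, M) and the weights lam1-lam3 are
# illustrative assumptions, not the paper's notation; the exact LRICSDLSR
# model and its solver are defined in the paper itself.
import numpy as np

def lricsdlsr_objective(X, Y, labels, W, b, M, lam1=0.1, lam2=0.1, lam3=0.1):
    """Evaluate the terms named in the abstract.

    X: d x n data matrix (one sample per column)
    Y: c x n one-hot label matrix
    labels: length-n array of class indices
    W: d x c regression projection, b: length-c bias
    M: c x n non-negative relaxation matrix
    """
    labels = np.asarray(labels)
    P = W.T @ X + b[:, None]              # projected samples, c x n
    B = np.where(Y > 0, 1.0, -1.0)        # assumed label-dragging directions
    T = Y + B * np.maximum(M, 0.0)        # relaxed label matrix
    fit = np.linalg.norm(P - T, "fro") ** 2             # least-squares fit
    # inter-class sparsity: row-wise L2,1 norm of each class's projected block
    ics = sum(np.sqrt((P[:, labels == k] ** 2).sum(axis=1)).sum()
              for k in np.unique(labels))
    low_rank = np.linalg.norm(T, "nuc")                 # low-rank relaxed labels
    reg = np.linalg.norm(T - Y, "fro") ** 2             # keep labels near targets
    return fit + lam1 * ics + lam2 * low_rank + lam3 * reg
```

Each term maps to one element of the abstract: the least-squares fit, the class-wise L2,1 penalty that enforces inter-class sparsity, the nuclear norm that keeps the relaxed labels low-rank, and the extra regularizer that prevents the learned labels from overfitting. Minimizing such an objective would typically alternate updates of W, M, and auxiliary variables, e.g. via ADMM [18] with singular value thresholding [19] for the nuclear-norm step.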

Key words: low-rank, inter-class sparsity, image classification, regression, projection

CLC Number: TP18
[1] ZHANG Zheng, LAI Zhihui, XU Yong, et al. Discriminative elastic-net regularized linear regression[J]. IEEE Transactions on Image Processing, 2017, 26(3):1466-1481.
[2] CHEN Zhe, WU Xiaojun, KITTLER J. A sparse regularized nuclear norm based matrix regression for face recognition with contiguous occlusion[J]. Pattern Recognition Letters, 2019, 125:494-499.
[3] YUAN H L, LI J Y, LAI L L, et al. Low-rank matrix regression for image feature extraction and feature selection[J]. Information Sciences, 2020, 522:214-226.
[4] RUPPERT D, WAND M P. Multivariate locally weighted least squares regression[J]. The Annals of Statistics, 1994, 22(3):1346-1370.
[5] RUPPERT D, SHEATHER S J, WAND M P. An effective bandwidth selector for local least squares regression[J]. Journal of the American Statistical Association, 1995, 90(432):1257-1270.
[6] TIBSHIRANI R. Regression shrinkage and selection via the LASSO: a retrospective[J]. Journal of the Royal Statistical Society: Series B(Statistical Methodology), 2011, 73(3):267-288.
[7] AN S J, LIU W Q, VENKATESH S. Face recognition using kernel ridge regression[C] //IEEE Conference on Computer Vision & Pattern Recognition. Piscataway: IEEE, 2007.
[8] AVRON H, CLARKSON K, WOODRUFF D. Faster kernel ridge regression using sketching and preconditioning[J]. SIAM Journal on Matrix Analysis and Applications, 2017, 38(4):1116-1138.
[9] STRUTZ T. Data fitting and uncertainty(a practical introduction to weighted least squares and beyond)[M]. Wiesbaden: Vieweg and Teubner, 2010.
[10] JIAO Licheng, BO Liefeng, WANG Ling. Fast sparse approximation for least squares support vector machine[J]. IEEE Transactions on Neural Networks, 2007, 18(3):685-697.
[11] ABDI H. Partial least squares regression and projection on latent structure regression(PLS Regression)[J]. Wiley Interdisciplinary Reviews: Computational Statistics, 2010, 2(1):97-106.
[12] XIANG Shiming, NIE Feiping, MENG Gaofeng. Discriminative least squares regression for multiclass classification and feature selection[J]. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(11):1738-1754.
[13] ZHANG Xuyao, WANG Lingfeng, XIANG Shiming, et al. Retargeted least squares regression algorithm[J]. IEEE Transactions on Neural Networks and Learning Systems, 2015, 26(9):2206-2213.
[14] WANG Lingfeng, PAN Chunhong. Groupwise retargeted least squares regression[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(4):1352-1358.
[15] FANG Xiaozhao, XU Yong, LI Xuelong, et al. Regularized label relaxation linear regression[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(4):1006-1018.
[16] FANG Xiaozhao, TENG Shaohua, LAI Zhihui, et al. Robust latent subspace learning for image classification[J]. IEEE Transactions on Neural Networks and Learning Systems, 2017, 29(6):2502-2515.
[17] WEN Jie, XU Yong, LI Zuoyong, et al. Inter-class sparsity based discriminative least square regression[J]. Neural Networks, 2018, 102(1):36-47.
[18] BOYD S, PARIKH N, CHU E, et al. Distributed optimization and statistical learning via the alternating direction method of multipliers[J]. Foundations and Trends in Machine Learning, 2010, 3(1):1-122.
[19] CAI J F, CANDES E J, SHEN Z W. A singular value thresholding algorithm for matrix completion[J]. SIAM Journal on Optimization, 2010, 20(4):1956-1982.
[20] YUAN Aihong, YOU Mengbo, HE Dongjian. Convex non-negative matrix factorization with adaptive graph for unsupervised feature selection[J]. IEEE Transactions on Cybernetics, 2020, 1(99):1-13.
[21] FANG Xiaozhao, HAN Na, WU Jikang, et al. Approximate low-rank projection learning for feature extraction[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(11):5228-5241.
[22] JIANG Z L, LIN Z, DAVIS L S. Learning a discriminative dictionary for sparse coding via label consistent K-SVD[C] //IEEE Conference on Computer Vision & Pattern Recognition. Piscataway:IEEE, 2011.
[23] LEARNED-MILLER E, HUANG G B, ROYCHOWDHURY A, et al. Labeled faces in the wild: a survey[M] //Advances in Face Detection and Facial Image Analysis. Berlin: Springer International Publishing, 2016.
[24] BOUREAU Y L, BACH F, LECUN Y, et al. Learning mid-level features for recognition[C] //Proc. International Conference on Computer Vision and Pattern Recognition(CVPR'10). Piscataway: IEEE, 2010.
[25] LI Hui, WU Xiaojun. DenseFuse: a fusion approach to infrared and visible images[J]. IEEE Transactions on Image Processing, 2019, 28(5):2614-2623.
[26] XU Tianyang, FENG Zhenhua, WU Xiaojun, et al. Joint group feature selection and discriminative filter learning for robust visual object tracking[J/OL]. arXiv, 2019. https://arxiv.org/abs/1907.13242.
[27] XU Tianyang, FENG Zhenhua, WU Xiaojun. Learning adaptive discriminative correlation filters via temporal consistency preserving spatial feature selection for robust visual object tracking[J]. IEEE Transactions on Image Processing, 2019, 28(11):5596-5609.
[28] HOU Chenping, JIAO Yuanyuan, NIE Feiping. 2D feature selection by sparse matrix regression[J]. IEEE Transactions on Image Processing, 2017, 26(9):4255-4268.
[29] DENG J, DONG W, SOCHER R, et al. ImageNet: a large-scale hierarchical image database[C] //Proc. International Conference on Computer Vision and Pattern Recognition(CVPR). Florida: IEEE, 2009.
[30] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[J/OL]. arXiv, 2014. https://arxiv.org/abs/1409.1556.
[31] HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C] //Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition(CVPR). Piscataway: IEEE, 2016.