JOURNAL OF SHANDONG UNIVERSITY (NATURAL SCIENCE), 2023, Vol. 58, Issue (3): 101-108. DOI: 10.6040/j.issn.1671-9352.0.2022.059


Low-rank matrix factorization with double Gaussian prior model

WEI Fang, WANG Chang-peng*   

School of Science, Chang'an University, Xi'an 710064, Shaanxi, China
Published: 2023-03-02

Abstract: To better fit complex noise and enhance the robustness of low-rank matrix factorization, double Gaussian priors are introduced into the traditional Gaussian mixture model, and a low-rank matrix factorization model with double Gaussian priors (DGP-LRMF) is proposed. The two factor matrices obtained by the decomposition are assumed to follow Gaussian priors, which enables effective modeling of the noise, and the model parameters are derived with the EM algorithm within the Bayesian framework. Experimental results show that the proposed model can effectively handle data corrupted by complex noise and achieves better and more stable denoising performance.
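A minimal sketch of the generative model suggested by the abstract is given below; the notation, hyperparameters, and exact placement of the priors are illustrative assumptions, not taken from the paper.

% Hypothetical DGP-LRMF formulation (assumed notation)
\begin{align}
  x_{ij} &= \mathbf{u}_i^{\top}\mathbf{v}_j + e_{ij}
  && \text{low-rank fit } X \approx UV^{\top} \\
  e_{ij} &\sim \sum_{k=1}^{K} \pi_k\, \mathcal{N}\!\big(e_{ij}\mid 0,\ \sigma_k^{2}\big)
  && \text{Gaussian mixture noise} \\
  \mathbf{u}_i &\sim \mathcal{N}\!\big(\mathbf{0},\ \lambda_u^{-1}\mathbf{I}\big), \quad
  \mathbf{v}_j \sim \mathcal{N}\!\big(\mathbf{0},\ \lambda_v^{-1}\mathbf{I}\big)
  && \text{double Gaussian priors}
\end{align}

Under such a formulation, the E-step of EM would compute the posterior responsibilities of the K noise components for each residual e_{ij}, and the M-step would update the mixture weights \pi_k, the variances \sigma_k^2, and the factor matrices U and V as regularized (MAP) estimates under their Gaussian priors.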

Key words: Gaussian mixture model, low-rank matrix decomposition, Gaussian prior

CLC Number: TP391