
Journal of Shandong University (Natural Science) ›› 2021, Vol. 56 ›› Issue (9): 87-95. doi: 10.6040/j.issn.1671-9352.0.2020.257


A class of newly modified WYL conjugate gradient algorithms

WANG Song-hua, LUO Dan*, LI Yong

  1. School of Mathematics and Statistics, Baise University, Baise 533000, Guangxi, China
  • Published: 2021-09-13
  • About the authors: WANG Song-hua (1970— ), male, associate professor; research interests: optimization theory and methods. E-mail: 523429892@qq.com. *Corresponding author: LUO Dan (1976— ), female, associate professor; research interests: optimization theory and methods, and grey systems. E-mail: 44394415@qq.com
  • Supported by:
    National Natural Science Foundation of China (11661001, 11661009) and Natural Science Foundation of Guangxi (2018GXNSFAA281259, 2020GXNSFAA159069)

Abstract: A class of newly modified WYL conjugate gradient algorithms is proposed for large-scale unconstrained optimization problems. The new algorithms possess the sufficient descent property and a trust-region property independently of any line search, and they are globally convergent under the weak Wolfe-Powell line search. Preliminary numerical results show that the new algorithms are effective and more competitive than the classical WYL-type conjugate gradient method.

Key words: unconstrained optimization, WYL conjugate gradient method, sufficient descent property, trust region, global convergence
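
For context, the notions named in the abstract are recalled below in the standard notation of the conjugate gradient literature. These are the classical background definitions only; the β shown is the original WYL parameter of Wei, Yao and Liu [4], not the modified formula proposed in this paper, which is not reproduced on this page.

```latex
% General conjugate gradient iteration for min f(x), with g_k = \nabla f(x_k):
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_k = \begin{cases} -g_k, & k = 0,\\ -g_k + \beta_k d_{k-1}, & k \ge 1. \end{cases}
\]
% Classical WYL conjugate gradient parameter (Wei-Yao-Liu, reference [4]):
\[
  \beta_k^{\mathrm{WYL}}
  = \frac{g_k^{\mathsf T}\!\left(g_k - \frac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\right)}{\|g_{k-1}\|^{2}}.
\]
% Sufficient descent and trust-region properties (constants c, C > 0 independent of k):
\[
  g_k^{\mathsf T} d_k \le -c\,\|g_k\|^{2}, \qquad \|d_k\| \le C\,\|g_k\|.
\]
% Weak Wolfe-Powell conditions on the step size \alpha_k, with 0 < \delta < \sigma < 1:
\[
  f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\mathsf T} d_k, \qquad
  \nabla f(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge \sigma\, g_k^{\mathsf T} d_k.
\]
```

As a runnable illustration of this framework, the following minimal sketch combines the classical WYL iteration with a bisection-type weak Wolfe-Powell line search. It is the baseline scheme the abstract compares against, not the authors' new algorithm; all function names, parameter values, and the toy problem are illustrative.

```python
# Illustrative only: the classical WYL conjugate gradient method of Wei, Yao and Liu [4],
# paired with a bisection-type weak Wolfe-Powell line search. This is a baseline scheme,
# not the modified algorithm proposed in the paper; names and defaults are illustrative.
import numpy as np


def weak_wolfe_powell(f, grad, x, d, delta=0.1, sigma=0.9, alpha0=1.0, max_iter=50):
    """Bisection search for a step size satisfying the weak Wolfe-Powell conditions.

    Assumes d is a descent direction at x (grad(x) @ d < 0).
    """
    lo, hi = 0.0, np.inf
    alpha = alpha0
    fx, slope = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + delta * alpha * slope:
            # Sufficient-decrease (Armijo) condition fails: shrink the step.
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * slope:
            # Curvature condition fails: enlarge the step.
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha  # fall back to the last trial step


def wyl_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical WYL conjugate gradient iteration (background method, see [4])."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = weak_wolfe_powell(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        # beta_k^WYL = g_k^T (g_k - (||g_k|| / ||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2
        beta = g_new @ (g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * g) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x


if __name__ == "__main__":
    # Toy test: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
    # whose unique minimizer solves A x = b (here approximately [0.6, -0.8]).
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    print(wyl_cg(f, grad, np.zeros(2)))
```

The toy run minimizes a small convex quadratic, so the iterates should approach the solution of Ax = b within a few iterations; for the nonconvex test problems considered in papers of this kind, only the hedged properties listed above (sufficient descent, trust region, weak Wolfe-Powell) are relied upon.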

CLC number:

  • O222.4
[1] DAI Yuhong, YUAN Yaxiang. Nonlinear conjugate gradient method[M]. Shanghai: Shanghai Science and Technology Publishers, 1999. (in Chinese)
[2] POWELL M J D. Nonconvex minimization calculations and the conjugate gradient method[J]. Lecture Notes in Mathematics, 1984, 1066:122-141.
[3] GILBERT J C, NOCEDAL J. Global convergence properties of conjugate gradient methods for optimization[J]. SIAM Journal on Optimization, 1992, 2(1):21-42.
[4] WEI Z X, YAO S W, LIU L Y. The convergence properties of some new conjugate gradient methods[J]. Applied Mathematics and Computation, 2006, 183(2):1341-1350.
[5] YUAN G L, WEI Z X, LI G Y. A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs[J]. Journal of Computational and Applied Mathematics, 2014, 255(1):86-96.
[6] LI Chunnian, YUAN Gonglin. A modified PRP conjugate gradient algorithm for unconstrained optimization problems[J]. Journal of Southwest University (Natural Science Edition), 2018, 40(9):67-75. (in Chinese)
[7] YUAN G L, WANG X L, SHENG Z. Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions[J]. Numerical Algorithms, 2020, 84:935-956.
[8] LIN Suihua. A modified FR spectral conjugate gradient method with Wolfe line search[J]. Journal of Shandong University (Natural Science), 2017, 52(4):6-12. (in Chinese)
[9] WANG Kairong, GAO Peiting. Two mixed conjugate gradient methods based on the DY method[J]. Journal of Shandong University (Natural Science), 2016, 51(6):16-23. (in Chinese)
[10] CHILMERAN H, AL-TAIE S, SAEED E. New parameter of CG-method for unconstrained optimization[J]. AL-Rafidain Journal of Computer Sciences and Mathematics, 2019, 13(1): 22-31.
[11] WANG Songhua, LI Yong, WU Jiaqi. A modified three-term LS spectral conjugate gradient method with a new line search[J]. Journal of Anhui University (Natural Science Edition), 2019, 43(4):40-44. (in Chinese)
[12] SUN Yingyi, LI Jian, SUN Zhongbo, et al. Two modified WYL conjugate gradient methods for unconstrained optimization problems[J]. Mathematica Applicata, 2019, 32(2):415-422. (in Chinese)
[13] JIAN Jinbao, YIN Jianghua, JIANG Xianzhen. An efficient conjugate gradient method with sufficient descent property[J]. Mathematica Numerica Sinica, 2015, 37(4):415-424. (in Chinese)
[14] DONG Xiaoliang, LI Weijun. Global convergence of a new Wei-Yao-Liu type conjugate gradient method[J]. Journal of Henan Normal University (Natural Science Edition), 2018, 46(4):107-112. (in Chinese)
[15] DAI Z F, WEN F H. Another improved Wei-Yao-Liu nonlinear conjugate gradient method with sufficient descent property[J]. Applied Mathematics and Computation, 2012, 218(14):7421-7430.
[16] YUAN G L, MENG Z H, LI Y. A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations[J]. Journal of Optimization Theory and Applications, 2016, 168(1):129-152.
[17] YUAN Yaxiang, SUN Wenyu. Optimization theory and methods[M]. Beijing: Science and Technology Press, 2001. (in Chinese)
[18] BONGARTZ I, CONN A R, GOULD N, et al. CUTE: constrained and unconstrained testing environment[J]. ACM Transactions on Mathematical Software, 1995, 21(1):123-160.
[19] ANDREI N. An unconstrained optimization test functions collection[J]. Advanced Modeling and Optimization, 2008, 10(1):147-161.
[20] DOLAN E D, MORÉ J J. Benchmarking optimization software with performance profiles[J]. Mathematical Programming, 2002, 91(2):201-213.