JOURNAL OF SHANDONG UNIVERSITY (NATURAL SCIENCE) ›› 2024, Vol. 59 ›› Issue (3): 51-60. doi: 10.6040/j.issn.1671-9352.7.2023.4633


Grey wolf optimization algorithm based on multi-strategy combination and its application

Hongwu QIN1,2, Lizheng WANG1,*, Yu FU1, Muxuan SUI1, Binggao HE1,2

  1. College of Electronic Information Engineering, Changchun University, Changchun 130000, Jilin, China
    2. Jilin Provincial Key Laboratory of Human Health Status Identification and Function Enhancement (Changchun University), Changchun 130022, Jilin, China
  • Received: 2023-04-29 Online: 2024-03-20 Published: 2024-03-06
  • Contact: Lizheng WANG E-mail: qinhongwu@ccu.edu.cn; wangliz0117@163.com

Abstract:

The standard grey wolf optimizer (GWO) struggles to balance local exploitation and global exploration. To address this problem, a multi-strategy grey wolf optimization algorithm (MSGWO), based on the fusion of several strategies, is presented. First, the Tent chaotic map and a nonlinear convergence factor are introduced into the grey wolf algorithm. Then, to coordinate the search during optimization, three learning strategies are applied: extensive learning, elite learning, and coordinated learning. Finally, roulette wheel selection is used to choose among the strategies, producing more diverse wolf positions and more globally representative individuals, and benchmark function tests are used to compare the algorithm variants. The results show that MSGWO converges faster and achieves a good balance between local exploitation and global search. On this basis, the MSGWO algorithm is used to optimize the hyperparameters of an echo state network (ESN) for regression prediction. The experiments show that the MSGWO-optimized model performs best, with a mean absolute percentage error of 0.38% and a goodness of fit (R2) of 0.98.
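For orientation, the sketch below shows the building blocks named above in Python/NumPy: Tent-map initialization, a nonlinear convergence factor, the standard GWO position update, and roulette wheel selection among strategies. The exact convergence-factor formula and the three MSGWO learning strategies are not specified on this page, so the forms used here are illustrative assumptions only, not the authors' implementation.

```python
import numpy as np

def tent_map_init(pop_size, dim, lb, ub, beta=0.7):
    """Population initialization with a Tent chaotic map (assumed usage)."""
    z = np.random.rand(pop_size, dim)
    for _ in range(50):  # iterate the map so the points spread over (0, 1)
        z = np.where(z < beta, z / beta, (1.0 - z) / (1.0 - beta))
    return lb + z * (ub - lb)

def convergence_factor(t, t_max):
    """Assumed nonlinear decay from 2 to 0; standard GWO decays linearly."""
    return 2.0 * (1.0 - (t / t_max) ** 2)

def gwo_step(X, alpha, beta_wolf, delta, a):
    """Standard GWO update: move each wolf toward the mean of three
    candidate positions guided by the alpha, beta and delta wolves."""
    candidates = []
    for leader in (alpha, beta_wolf, delta):
        A = a * (2.0 * np.random.rand(*X.shape) - 1.0)
        C = 2.0 * np.random.rand(*X.shape)
        D = np.abs(C * leader - X)
        candidates.append(leader - A * D)
    return sum(candidates) / 3.0

def roulette_select(weights):
    """Roulette wheel choice among the learning strategies."""
    p = np.asarray(weights, dtype=float)
    return np.random.choice(len(p), p=p / p.sum())

# Tiny usage example on the sphere function f1
X = tent_map_init(pop_size=30, dim=30, lb=-100.0, ub=100.0)
fitness = np.sum(X ** 2, axis=1)
alpha, beta_w, delta = X[np.argsort(fitness)[:3]]
X = gwo_step(X, alpha, beta_w, delta, a=convergence_factor(t=1, t_max=500))
strategy = roulette_select([0.4, 0.3, 0.3])  # 0: extensive, 1: elite, 2: coordinated
```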

Key words: grey wolf optimizer, multiple strategies, roulette wheel selection, convergence factor, echo state network

CLC Number: TP301.6

Fig.1

MSGWO algorithm flowchart

Table 1

Unimodal benchmark functions

Expression  Dimension  Search range  Optimum
$f_1(x)=\sum\limits_{i=1}^n x_i^2$  30  [-100, 100]  0
$f_2(x)=\sum\limits_{i=1}^n\left|x_i\right|+\prod\limits_{i=1}^n\left|x_i\right|$  30  [-10, 10]  0
$f_3(x)=\sum\limits_{i=1}^n\left(\sum\limits_{j=1}^i x_j\right)^2$  30  [-100, 100]  0
$f_4(x)=\max\limits_i\left\{\left|x_i\right|, 1 \leqslant i \leqslant n\right\}$  30  [-100, 100]  0
$f_5(x)=\sum\limits_{i=1}^{n-1}\left[100\left(x_{i+1}-x_i^2\right)^2+\left(x_i-1\right)^2\right]$  30  [-30, 30]  0
$f_6(x)=\sum\limits_{i=1}^n\left(\left[x_i+0.5\right]\right)^2$  30  [-100, 100]  0
$f_7(x)=\sum\limits_{i=1}^n i x_i^4+\operatorname{random}[0, 1)$  30  [-1.28, 1.28]  0
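As a reference for how the tabulated expressions translate into code, here is a small Python/NumPy sketch (not from the paper) of f1 (sphere) and f5 (Rosenbrock):

```python
import numpy as np

def f1_sphere(x):
    """f1: sum of squares; global minimum 0 at the origin."""
    return np.sum(x ** 2)

def f5_rosenbrock(x):
    """f5: Rosenbrock valley; global minimum 0 at x = (1, ..., 1)."""
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

print(f1_sphere(np.zeros(30)), f5_rosenbrock(np.ones(30)))  # 0.0 0.0
```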

Table 2

Multimodal benchmark functions

Expression  Dimension  Search range  Optimum
$f_8(x)=\sum\limits_{i=1}^n-x_i \sin \left(\sqrt{\left|x_i\right|}\right)$  30  [-500, 500]  -2 094.9
$f_9(x)=\sum\limits_{i=1}^n\left[x_i^2-10 \cos \left(2 \pi x_i\right)+10\right]$  30  [-5.12, 5.12]  0
$f_{10}(x)=-20 \exp \left(-0.2 \sqrt{\frac{1}{n} \sum\limits_{i=1}^n x_i^2}\right)-\exp \left(\frac{1}{n} \sum\limits_{i=1}^n \cos \left(2 \pi x_i\right)\right)+20+\mathrm{e}$  30  [-32, 32]  0
$f_{11}(x)=\frac{1}{4000} \sum\limits_{i=1}^n x_i^2-\prod\limits_{i=1}^n \cos \left(x_i / \sqrt{i}\right)+1$  30  [-600, 600]  0
$f_{12}(x)=\frac{\pi}{n}\left\{10 \sin \left(\pi y_1\right)+\sum\limits_{i=1}^{n-1}\left(y_i-1\right)^2\left[1+10 \sin ^2\left(\pi y_{i+1}\right)\right]+\left(y_n-1\right)^2\right\}+\sum\limits_{i=1}^n u\left(x_i, 10, 100, 4\right), \quad y_i=1+\frac{x_i+1}{4}$  30  [-50, 50]  0
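Likewise, a short Python sketch (not from the paper) of two of the multimodal functions, f9 (Rastrigin) and f10 (Ackley), both of which attain their minimum of 0 at the origin:

```python
import numpy as np

def f9_rastrigin(x):
    """f9: Rastrigin; many local minima, global minimum 0 at the origin."""
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def f10_ackley(x):
    """f10: Ackley; global minimum 0 at the origin."""
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)

print(f9_rastrigin(np.zeros(30)), f10_ackley(np.zeros(30)))  # 0.0 and ~0.0
```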

Table 3

Comparison of experimental data for unimodal functions

Function  Statistic  GWO  MIGWO  MAGWO  PSO  MSGWO
f1  Mean  7.50E-28  2.78E-36  2.22E+03  3.85E-04  0.00E+00
f1  Std  1.46E-27  4.03E-36  9.25E+02  1.40E-03  0.00E+00
f2  Mean  1.18E-16  8.47E-22  1.71E+01  3.93E-02  5.25E-219
f2  Std  1.23E-16  1.15E-21  4.42E+00  4.76E-02  0.00E+00
f3  Mean  2.63E-05  1.56E-06  4.42E+04  8.44E+01  0.00E+00
f3  Std  9.73E-05  4.85E-06  9.28E+03  2.34E+01  0.00E+00
f4  Mean  5.02E-07  2.32E-09  8.57E+01  1.10E+00  7.80E-291
f4  Std  5.07E-07  3.34E-09  4.88E+00  2.05E-01  0.00E+00
f5  Mean  2.71E+01  2.68E+01  2.47E+06  9.96E+01  2.79E+01
f5  Std  6.72E-01  6.40E-01  1.58E+06  6.51E+01  7.37E-01
f6  Mean  8.51E-01  6.86E-01  2.73E+03  1.48E-04  1.81E+00
f6  Std  4.45E-01  3.55E-01  8.68E+02  1.38E-04  7.56E-01
f7  Mean  2.00E-03  1.50E-03  1.33E+00  1.84E-01  5.30E-05
f7  Std  1.10E-03  6.94E-04  6.06E-01  6.87E-02  6.84E-05

Table 4

Comparison of experimental data on multimodal functions

Function  Statistic  GWO  MIGWO  MAGWO  PSO  MSGWO
f8  Mean  -6.03E+03  -5.55E+03  -4.49E+03  -4.89E+03  -5.82E+03
f8  Std  9.65E+02  1.23E+03  2.87E+02  1.28E+03  1.47E+03
f9  Mean  2.74E+00  1.82E-01  2.44E+02  5.94E+01  0.00E+00
f9  Std  3.49E+00  9.98E-01  3.42E+01  1.56E+01  0.00E+00
f10  Mean  9.62E-14  2.29E-14  2.00E+01  2.76E-01  4.20E-15
f10  Std  1.37E-14  4.70E-15  5.70E-03  5.55E-01  9.01E-16
f11  Mean  7.65E-04  8.53E-04  2.73E+01  7.60E-03  0.00E+00
f11  Std  2.90E-03  3.30E-03  1.42E+01  8.70E-03  0.00E+00
f12  Mean  5.27E-02  3.59E-02  2.75E+06  1.04E-02  1.46E-01
f12  Std  2.97E-02  1.52E-02  2.58E+06  4.17E-02  7.06E-02

Fig.2

Convergence curves of functions f1–f12

Table 5

Comparison of experimental results

Model  MAE  MAPE  RMSEP  R2
LSTM  3.7004  0.0081  4.7086  0.94505
ELM  3.2627  0.0072  4.1819  0.93856
ESN  3.5327  0.0078  4.4521  0.93025
GWO-ESN  1.9671  0.0043  2.5068  0.97788
MAGWO-ESN  1.9862  0.0044  2.5246  0.97757
MSGWO-ESN  1.7251  0.0038  2.1988  0.98299
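For reference, a minimal Python sketch of how the error measures in Table 5 are typically computed, assuming RMSEP denotes the root-mean-square error of prediction and R2 the coefficient of determination (the input values below are hypothetical, not the paper's data):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return MAE, MAPE (as a fraction), RMSEP, and R^2."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    mape = np.mean(np.abs(err / y_true))
    rmsep = np.sqrt(np.mean(err ** 2))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return mae, mape, rmsep, r2

# Hypothetical values for illustration only
print(regression_metrics([480.0, 500.0, 520.0], [478.0, 503.0, 519.0]))
```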

Fig.3

Comparison of prediction results

Fig.4

Prediction result error chart
