Hybrid PSOGSA Algorithm

Jie Yu¹, Hong-Jiang Wang², Jeng-Shyang Pan³, Kuo-Chi Chang²,⁴,⁵, Truong-Giang Ngo⁶, and Trong-The Nguyen²,⁷

¹ College of Mechanical and Automotive Engineering, Fujian University of Technology, Fuzhou 350118, China
² Fujian Provincial Key Laboratory of Big Data Mining and Applications, Fujian University of Technology, Fuzhou, China
albertchangxuite@gmail.com, jvnthe@gmail.com
³ College of Computer Science and Engineering, Shandong University of Science and Technology, Shandong, China
⁴ College of Mechanical and Electrical Engineering, National Taipei University of Technology, Taipei, Taiwan
⁵ Department of Business Administration, North Borneo University College, Sabah, Malaysia
⁶ Thuyloi University, 175 Tay Son, Dong Da, Hanoi, Vietnam
giangnt@tlu.edu.vn
⁷ Haiphong University of Management and Technology, Haiphong, Vietnam
Abstract. This study suggests a new metaheuristic algorithm for global optimization, based on parallel hybridization of particle swarm optimization (PSO) and the gravitational search algorithm (GSA). Subgroups of the population are formed by dividing the swarm's community, and communication between the subsets is enabled by adding mutation strategies. Twenty-three benchmark functions are used to test the performance and verify the feasibility of the proposed algorithm. Compared with PSO, GSA, and parallel PSO (PPSO), the findings reveal that the proposed PPSOGSA achieves higher precision than the competitor algorithms.

Keywords: Parallel PSOGSA algorithm · Mutation strategy · Particle swarm optimization · Gravitational search algorithm
1 Introduction
Nowadays, metaheuristic algorithms have been used in many industries, such as power, transportation, and aviation [1], among other fields. There are three kinds of nature-inspired metaheuristic algorithms: those inspired by natural physical phenomena, those inspired by biological evolution, and those inspired by the living habits of populations. Each kind has many representative algorithms, such as the gravitational search algorithm (GSA) [2], the simulated annealing algorithm (SA) [3], and black hole (BH)
[4], which are beneficial representatives inspired by natural physical phenomena. Differential evolution (DE) [5] and genetic algorithms (GA) [6] are metaheuristic algorithms inspired by the biological evolution process in nature, and more metaheuristic algorithms are derived from the living habits of populations; for example, PSO [7], the grey wolf optimizer (GWO) [8], the firefly algorithm (FA) [9], and the whale optimization algorithm (WOA) [10] are powerful representatives. Most metaheuristic algorithms suffer from a slow convergence rate and are prone to being trapped in locally optimal solutions. Many researchers have proposed a variety of hybrid algorithms to improve this situation, mainly integrating the advantages of different algorithms to enhance exploitation ability, whether partially or comprehensively, so that performance is better than that of the single optimization algorithms before mixing [11–13]. For example, in 2010, Seyedali Mirjalili and others proposed a hybrid algorithm of PSO and GSA (PSOGSA) [14], which combines the advantages of PSO and GSA, making its performance superior to the original PSO and GSA algorithms. Some scholars have proposed clustering-based algorithms to improve performance; for example, the parallel particle swarm optimization (PPSO) algorithm [15–17], a leapfrog-style clustering algorithm, is a good representative. Since the ideas of mixing and clustering both help improve an algorithm's performance, this study puts forward a new optimization algorithm based on a parallel hybrid PSOGSA algorithm with an added mutation strategy on the optimal solution to enhance performance. Twenty-three benchmark functions are selected for the performance test of the improved algorithm, which is compared with four related algorithms.
2 Three Standard Algorithms
The PSO mathematical model is as follows:

$$v_i^d(t+1) = \omega v_i^d(t) + c_1 \cdot rand \cdot \left( pbest_i^d - x_i^d(t) \right) + c_2 \cdot rand \cdot \left( gbest^d - x_i^d(t) \right) \quad (1)$$

$$x_i^d(t+1) = x_i^d(t) + v_i^d(t+1) \quad (2)$$
In the formulas, $v_i^d(t)$ denotes the velocity of the $i$th particle in the $d$th dimension ($d \in \{1, 2, \ldots, D\}$, where $D$ is the dimension of the search space), $x_i^d(t)$ is the current position of the $i$th particle, $\omega$ is the inertia weight, $c_1$ and $c_2$ are two constants, $rand$ is a random number in the range $[0, 1]$, $pbest_i^d$ is the best position found so far by particle $i$, and $gbest^d$ is the optimal solution currently obtained.
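To make the update concrete, the following minimal sketch (in Python with NumPy; the function name and array shapes are our own, not from the paper) implements Eqs. (1) and (2) for a whole swarm at once:

```python
import numpy as np

def pso_update(x, v, pbest, gbest, omega, c1=1.5, c2=1.5):
    """One PSO step per Eqs. (1)-(2).

    x, v, pbest: arrays of shape (N, D); gbest: array of shape (D,).
    """
    n, d = x.shape
    r1 = np.random.rand(n, d)  # 'rand' in Eq. (1), drawn per particle and dimension
    r2 = np.random.rand(n, d)
    v = omega * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # Eq. (1)
    x = x + v                                                      # Eq. (2)
    return x, v
```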
The mathematical model of the GSA algorithm can be expressed by a series of formulas, Formulas (3)–(5):

$$vel_i^d(t+1) = rand \cdot vel_i^d(t) + a_i^d(t) \quad (3)$$

$$a_i^d(t) = F_i^d(t) / M_i(t) \quad (4)$$

$$S_i^d(t+1) = S_i^d(t) + vel_i^d(t+1) \quad (5)$$
where $S_i^d(t)$ represents the position of the $i$th particle, $vel_i^d(t)$ is the velocity of the $i$th particle, $a_i^d(t)$ is its acceleration, $M_i(t)$ is its inertia mass, and $F_i^d(t)$ is the resultant force acting on it. The calculation method of the inertia mass is shown in Eqs. (6) and (7):
$$m_i(t) = \frac{fit_i(t) - worst(t)}{best(t) - worst(t)} \quad (6)$$

$$M_i(t) = m_i(t) \Big/ \sum_{j=1}^{N} m_j(t) \quad (7)$$
where $fit_i(t)$ represents the fitness function value of the $i$th particle, $best(t)$ represents the best global value obtained so far, $worst(t)$ is the worst fitness value obtained so far, and $N$ is the total number of particles. With the inertia masses, the interaction force between particles can be expressed as:
$$F_{ij}^d(t) = \frac{G(t) \cdot M_i(t) \cdot M_j(t)}{R_{ij}(t) + \varepsilon} \left( X_j^d(t) - X_i^d(t) \right) \quad (8)$$
Here $F_{ij}^d(t)$ represents the $d$-dimensional gravitational force between particles $i$ and $j$, $R_{ij}(t)$ represents the Euclidean distance between particles $i$ and $j$, $\varepsilon$ is a small constant, and $G(t)$ is the gravitational constant, whose expression is shown in Eq. (9):

$$G(t) = G_0 \cdot e^{-\alpha t / max_t} \quad (9)$$

where $\alpha$ and $G_0$ are constants, and $max_t$ is the set maximum number of iterations. With the support of the above formulas, the expression of the resultant force is:
$$F_i^d(t) = \sum_{j=1, j \neq i}^{N} rand_j \cdot F_{ij}^d(t) \quad (10)$$
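Putting Eqs. (3)–(10) together, one GSA iteration for a minimization problem can be sketched as follows (a minimal Python/NumPy illustration; the function name, the epsilon guard, and the loop structure are our assumptions):

```python
import numpy as np

def gsa_step(s, vel, fit, t, max_t, g0=100.0, alpha=23.0, eps=1e-12):
    """One GSA iteration per Eqs. (3)-(10). s, vel: (N, D); fit: (N,)."""
    n, d = s.shape
    best, worst = fit.min(), fit.max()        # minimization: lower fitness is better
    m = (fit - worst) / (best - worst + eps)  # Eq. (6)
    M = m / (m.sum() + eps)                   # Eq. (7)
    G = g0 * np.exp(-alpha * t / max_t)       # Eq. (9)
    acc = np.zeros((n, d))
    for i in range(n):
        diff = s - s[i]                              # X_j - X_i for every j
        R = np.linalg.norm(diff, axis=1)             # Euclidean distances R_ij
        coef = G * M * M[i] / (R + eps)              # scalar part of Eq. (8)
        coef[i] = 0.0                                # no self-interaction
        F = (np.random.rand(n, 1) * coef[:, None] * diff).sum(axis=0)  # Eq. (10)
        acc[i] = F / (M[i] + eps)                    # Eq. (4)
    vel = np.random.rand(n, d) * vel + acc           # Eq. (3)
    s = s + vel                                      # Eq. (5)
    return s, vel
```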
The formulas of the PSOGSA algorithm can be expressed by Formulas (11) and (12):

$$V_i^d(t+1) = \omega V_i^d(t) + c_1 \cdot rand_1 \cdot a_i^d(t) + c_2 \cdot rand_2 \cdot \left( gbest^d - X_i^d(t) \right) \quad (11)$$

$$X_i^d(t+1) = X_i^d(t) + V_i^d(t+1) \quad (12)$$
where $c_1$ and $c_2$ are constants, $rand_1$ and $rand_2$ are random numbers in $[0, 1]$, $\omega$ is the inertia weight coefficient, $V_i^d(t)$ represents the velocity of the $i$th particle, $a_i^d(t)$ is the acceleration of the $i$th particle, calculated in the same way as in GSA, $X_i^d(t)$ is the position of particle $i$ in the $d$th dimension at iteration $t$, and $gbest^d$ represents the current optimal solution.
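Under the same conventions as the sketches above, the hybrid update of Eqs. (11) and (12) differs from plain PSO only in that the pbest term is replaced by the GSA acceleration (a minimal sketch; the names are ours, not the paper's):

```python
import numpy as np

def psogsa_update(x, v, acc, gbest, omega, c1=0.5, c2=1.5):
    """Hybrid PSOGSA step per Eqs. (11)-(12): x, v, acc are (N, D); gbest is (D,)."""
    n, d = x.shape
    r1 = np.random.rand(n, d)
    r2 = np.random.rand(n, d)
    v = omega * v + c1 * r1 * acc + c2 * r2 * (gbest - x)  # Eq. (11)
    x = x + v                                              # Eq. (12)
    return x, v
```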
3 Parallel PSOGSA
This paper proposes a parallel PSOGSA. The idea is to divide all $N$ particle individuals into $G$ subgroups, let the $G$ subgroups run PSOGSA independently to find the optimal value, and let the subgroups communicate with each other at specific numbers of iterations, so as to exploit the advantages of cooperation between subsets and allow the subgroups to continuously move toward high-quality solutions. In this study, there are two strategies for subgroup communication, both triggered at specific numbers of iterations. Strategy (1): to help the algorithm quickly jump out of a local-optimum trap, a mutation idea is proposed that moves particles away from the current global optimal solution. In each subgroup, the same number of individuals is randomly selected for mutation. When the trigger condition of $R$ iterations is met, the selected individuals are updated according to Formula (13):
$$X_{ij} = X_{gbest}^{k} \cdot \gamma \quad (13)$$

where $X_{ij}$ is the current solution of the selected individual ($i$ is a randomly selected individual, $i \in \{1, 2, \ldots, sizepop/G\}$, where $sizepop$ is the maximum number of particles; $j \in \{1, 2, \ldots, dim\}$, where $dim$ is the dimension of the search space), $X_{gbest}^{k}$ is the $k$th dimension value of the currently obtained global optimal solution ($k \in \{1, 2, \ldots, dim\}$), and $\gamma$ is a number in the range $[0.1, 1]$.
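The exact form of Formula (13) is hard to recover from the text, so the sketch below should be read as one plausible implementation of strategy (1): each mutated individual is rebuilt dimension by dimension from randomly chosen dimensions of the global best scaled by $\gamma \in [0.1, 1]$. The function name, the number of mutants n_mut, and the per-dimension sampling are all our assumptions:

```python
import numpy as np

def mutate_subgroup(X, gbest, n_mut=2):
    """Strategy (1) sketch: mutate n_mut randomly chosen rows of X.

    X: (sub_size, dim) subgroup positions; gbest: (dim,) global best solution.
    """
    sub_size, dim = X.shape
    rows = np.random.choice(sub_size, size=n_mut, replace=False)  # individuals i
    for i in rows:
        for j in range(dim):
            k = np.random.randint(dim)            # dimension k of the global best
            gamma = np.random.uniform(0.1, 1.0)   # gamma in [0.1, 1]
            X[i, j] = gbest[k] * gamma            # assumed reading of Formula (13)
    return X
```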
Strategy (2): choose two subgroups $G_a$ and $G_b$ arbitrarily from the $G$ subgroups; when the trigger condition of $R_1$ iterations is met, $G_a$ and $G_b$ communicate the optimal value and optimal solution to the remaining subgroups $G_k$ ($k \notin \{a, b\}$). The communication method is shown in Fig. 1.
Fig. 1. Schematic diagram of communication strategy 2.
In Fig. 1, $k$ is the number of the current iteration, $R$ is the trigger condition, $max$ is the maximum number of cycles, $G_i$ ($i \in \{1, 2, \ldots, n\}$) represents the subgroups, $n$ is the number of subgroups, and $G_1$ and $G_3$ are the two selected subgroups. The global optimal solution of each subgroup is perturbed within a small range of variation, which further expands the search scope and helps the algorithm avoid falling into a local optimum. The method is shown in Formulas (14) and (15):
$$W_d(t) = \left( \sum_{i=1}^{Popsize} V_i^d(t) \right) \Big/ Popsize \quad (14)$$

$$gbest_d^{*}(t) = gbest_d(t) + W_d(t) \cdot N(0, 1) \quad (15)$$
where $V_i^d(t)$ and $Popsize$ are as defined before, $N(0, 1)$ is the standard normal distribution function, and $gbest_d^{*}(t)$ is the perturbed optimal solution currently obtained by the subgroup.
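A direct transcription of Eqs. (14) and (15) in Python/NumPy (names are ours):

```python
import numpy as np

def perturb_gbest(V, gbest):
    """Perturb a subgroup's best solution per Eqs. (14)-(15).

    V: (Popsize, D) velocities of the subgroup; gbest: (D,) its best solution.
    """
    W = V.mean(axis=0)                                # Eq. (14): mean velocity per dimension
    return gbest + W * np.random.randn(*gbest.shape)  # Eq. (15): N(0,1) noise
```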
Taking $G = 4$ groups as an example, Fig. 2 shows the process of the PPSOGSA algorithm using the two communication strategies, where $k = R$ is the starting condition of the first strategy, $k = R_1$ is the starting condition of the second strategy, and $k = max$ is the ending condition of the algorithm cycle. That is to say, every $R$ iterations PPSOGSA uses strategy (1) for communication, and every $R_1$ iterations it uses strategy (2).
Fig. 2. The communication method of PPSOGSA, taking division into four subgroups as an example.
The PPSOGSA algorithm pseudo-code is shown in Table 1.
4 Experimental Data
The performance of the PPSOGSA algorithm is tested on 23 benchmark functions. In this experiment, the objective for each function is its minimum value in the corresponding range. The parameters of the various optimization algorithms are set as follows. For PSO: $C_1 = C_2 = 1.5$, $\omega$ is linearly reduced from 0.9 to 0.2, the maximum speed is $v_{max} = 5$ and $v_{min} = -5$. GSA uses the following settings: $G_0 = 100$, $\alpha = 23$. For PSOGSA: $c_1 = 0.5$, $c_2 = 1.5$, $G = 1$, $\alpha = 23$, $\omega$ is a random number in $[0, 1]$. For PPSO: $C_1 = C_2 = 1.5$, $\omega$ decreases linearly from 0.9 to 0.2, $v_{max} = 5$, $v_{min} = -5$. For PPSOGSA, the following settings are used: $G = 1$, $\alpha = 23$, $\omega$ is a random number in $[0, 1]$, the population is divided into four groups, $R = 10$, $R_1 = 50$.
Table 1. Pseudo-code of PPSOGSA

Initialization:
Initialize N particles and divide them into G groups randomly and evenly; set the largest generation max_iteration and the communication strategy trigger conditions R and R1; initialize the gravitational constant, inertia masses, and accelerations.
Iteration:
1: While T < max_iteration do
2:   Update the gravitational constant through Formula (9)
3:   For group = 1 to G do
4:     For i = 1 to N/G do
5:       Calculate the fitness function value of each particle
6:       Update the global optimal solutions and optimal values of the subgroup and the whole population
7:     end For
8:     Update the inertia masses, gravitational forces, accelerations, particle speeds, and positions
9:     According to the updated particle velocities, use Formulas (14) and (15) to update the perturbation of the global optimum
10:  end For
11:  If T is an integral multiple of R, use strategy 1 for communication
12:  If T is an integral multiple of R1, use strategy 2 for communication
13:  T = T + 1
14: end While
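Reading Table 1 alongside the two communication strategies, the overall loop might be organized as in the following Python sketch. It reuses psogsa_update, mutate_subgroup, and perturb_gbest from the earlier sketches, and assumes a helper gsa_acc(S, fit, G) that factors the mass/force/acceleration computation of the GSA sketch (Eqs. (4)–(10)) to return accelerations; f is the objective function, and all structure beyond what Table 1 states is our assumption:

```python
import numpy as np

def ppsogsa(f, dim, n=40, groups=4, max_iter=500, R=10, R1=50,
            lb=-100.0, ub=100.0, g0=1.0, alpha=23.0):
    """Sketch of the PPSOGSA loop of Table 1 (structure and names are ours)."""
    sub = n // groups
    X = np.random.uniform(lb, ub, (groups, sub, dim))  # G subgroups of particles
    V = np.zeros((groups, sub, dim))
    g_best = X[:, 0].copy()                            # per-subgroup best solutions
    g_val = np.full(groups, np.inf)                    # per-subgroup best values
    for t in range(1, max_iter + 1):
        G = g0 * np.exp(-alpha * t / max_iter)         # Eq. (9)
        for g in range(groups):
            fit = np.apply_along_axis(f, 1, X[g])      # evaluate subgroup g
            i = int(fit.argmin())
            if fit[i] < g_val[g]:                      # update subgroup best
                g_val[g], g_best[g] = fit[i], X[g, i].copy()
            acc = gsa_acc(X[g], fit, G)                # Eqs. (4)-(10), see GSA sketch
            target = perturb_gbest(V[g], g_best[g])    # Eqs. (14)-(15), Table 1 line 9
            X[g], V[g] = psogsa_update(X[g], V[g], acc, target, np.random.rand())
        if t % R == 0:                                 # strategy (1): mutation
            for g in range(groups):
                X[g] = mutate_subgroup(X[g], g_best[g])
        if t % R1 == 0:                                # strategy (2): share bests
            a, b = np.random.choice(groups, 2, replace=False)
            donor = a if g_val[a] < g_val[b] else b    # better of the two chosen
            for g in range(groups):
                if g not in (a, b) and g_val[donor] < g_val[g]:
                    g_val[g], g_best[g] = g_val[donor], g_best[donor].copy()
    k = int(g_val.argmin())
    return g_best[k], g_val[k]
```

Feeding the perturbed subgroup best into the gbest term of Eq. (11) is our reading of Table 1 line 9; an equally defensible variant applies the perturbation only when updating the stored best.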
The setting of this experiment is that each algorithm runs 20 times independently, and the average value of the 20 experimental runs is taken as the experimental result, which is shown in Table 2.

The bold numbers in Table 2 are the best values obtained, divided into the best average value and the best optimal value. According to the statistical analysis of Table 2, for the best average value, the numbers of functions on which PSO, GSA, PSOGSA, PPSO, and PPSOGSA perform best are 5, 7, 6, 7, and 17, respectively. For the best optimal value in the test process, the corresponding numbers are 10, 11, 12, 10, and 18, respectively. From these statistics, we confirm that the overall performance of PPSOGSA is better than that of PPSO, GSA, PSOGSA, and PSO; on multi-dimensional objective functions, the accuracy of its function values is higher and closer to each function's optimal value. Figure 3 shows the convergence curves of some selected benchmark functions. Comparing the convergence curves of each algorithm, it can be found that the proposed PPSOGSA algorithm has a faster convergence speed and its performance is better than that of the four compared optimization algorithms.
Fig. 3. The convergence curves of some selected functions (panels F1 and F5).
5 Conclusion
In this study, we introduced a parallel PSOGSA algorithm based on hybridizing the PSOGSA algorithm with the idea of clustering. The concept of mutation and the interaction between subgroups are used to drive the algorithm toward the optimal value. Twenty-three benchmark functions are used to evaluate the performance of the PPSOGSA algorithm. Comparison of the obtained results with the PSOGSA, GSA, PSO, and PPSO algorithms shows that the proposed PPSOGSA algorithm provides better overall performance than the other four optimization algorithms.
Acknowledgements. This work was supported in part by the Fujian provincial buses and special vehicles R&D collaborative innovation center project (Grant Number: 2016BJC012).
References

1. Nguyen, T.T., Pan, J.S., Dao, T.K.: An improved flower pollination algorithm for optimizing layouts of nodes in wireless sensor network. IEEE Access 7, 75985–75998 (2019)
2. Rashedi, E., Nezamabadi-Pour, H., Saryazdi, S.: GSA: a gravitational search algorithm. Inf. Sci. 179(13), 2232–2248 (2009)
3. Van Laarhoven, P.J., Aarts, E.H.: Simulated annealing. In: Simulated Annealing: Theory and Applications, pp. 7–15. Springer, Berlin (1987)
4. Hatamlou, A.: Black hole: a new heuristic optimization approach for data clustering. Inf. Sci. 222, 175–184 (2013)
5. Price, K.V.: Differential evolution. In: Handbook of Optimization, pp. 187–214. Springer, Berlin (2013)
6. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of ICNN'95 – International Conference on Neural Networks, vol. 4, pp. 1942–1948. IEEE (1995)
7. Shi, Y.: Particle swarm optimization: developments, applications and resources. In: Proceedings of the 2001 Congress on Evolutionary Computation (IEEE Cat. No. 01TH8546), vol. 1, pp. 81–86. IEEE (2001)
8. Mirjalili, S., Mirjalili, S.M., Lewis, A.: Grey wolf optimizer. Adv. Eng. Softw. 69 (2014)
9. Yang, X.-S.: Firefly algorithm. In: Nature-Inspired Metaheuristic Algorithms, vol. 20, pp. 79–90 (2008)
10. Mirjalili, S., Lewis, A.: The whale optimization algorithm. Adv. Eng. Softw. 95 (2016)
11. Esmin, A., Lambert-Torres, G., Alvarenga, G.B.: Hybrid evolutionary algorithm based on PSO and GA mutation. In: Sixth International Conference on Hybrid Intelligent Systems (HIS'06), pp. 57–57. IEEE (2006)
12. Nguyen, T.-T., Qiao, Y., Pan, J.-S., Chu, S.-C., Chang, K.-C., Xue, X., Dao, T.-K.: A hybridized parallel bats algorithm for combinatorial problem of traveling salesman. J. Intell. Fuzzy Syst., 1–10 (2020). https://doi.org/10.3233/jifs-179668
13. Nguyen, T.-T., Pan, J.-S., Chu, S.-C., Roddick, J.F., Dao, T.-K.: Optimization localization in wireless sensor network based on multi-objective firefly algorithm. J. Netw. Intell. 1, 130–138 (2016)
14. Mirjalili, S., Hashim, S.Z.M.: A new hybrid PSOGSA algorithm for function optimization. In: 2010 International Conference on Computer and Information Application, pp. 374–377. IEEE (2010)
15. Chang, J.-F., Roddick, J.F., Pan, J.-S., Chu, S.-C.: A parallel particle swarm optimization