Shouwen Chen 1,2, Zhuoming Xu 1, Yan Tang 1, and Shun Liu 2
Academic Editor: Roque J. Saltaren
1 College of Computer and Information, Hohai University, Nanjing, Jiangsu 210098, China
2 College of Mathematics and Information, Chuzhou University, Chuzhou, Anhui 239000, China
Received 10 April 2014; Accepted 23 July 2014; Published 30 December 2014
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
Particle swarm optimization (PSO), originally proposed by Kennedy and Eberhart in 1995, is a population-based stochastic optimization technique inspired by the social behavior of bird flocking and fish schooling [1]. The PSO algorithm is easy to implement and has been shown empirically to perform well on many optimization problems [2-5]. The development of PSO can be broadly classified into three categories. The first emphasizes variants of the PSO mechanism itself, both in mathematics [6] and in topology [7]. The second hybridizes other optimization techniques with PSO, such as ACO [8], GA [9, 10], tabu search [11], and simulated annealing [12]. The third leverages the advantages of chaos maps, such as certainty, ergodicity, and stochasticity [13]. The hybridization of chaos with PSO has also become an active direction in recent research [14, 15].
Drawing on the concept of the center of gravity in physics, Song et al. designed a centroid particle swarm based on each particle's best position and proposed the centroid PSO (CPSO) algorithm [16] to enhance individual and group collaboration and information-sharing capabilities. Gou et al. proposed the ACL-PSO algorithm [17], whose population centroid is based on every particle's current position. Their experimental results showed that the ACL-PSO algorithm improved global search capability and effectively avoided premature convergence. The inertia weight, in linearly decreasing form, was first embedded into the original PSO by Shi and Eberhart [18]. From their work, a conclusion can be drawn that a large inertia weight facilitates a global search while a small one facilitates a local search. Subsequently, various inertia weights were introduced, expressed in exponential [19] or other nonlinear forms [20-22]. Recently, Ting et al. [23] proposed an exponential inertia weight framework. After careful analysis of the effect of the local attractor and the global attractor, they presented suggestions for adjusting these attractors to improve the performance of the PSO algorithm.
In this paper, to prevent PSO from falling into a local optimum, we propose an improved PSO algorithm (IPSO) based on the two forms of exponential inertia weight proposed by Ting et al. and on two kinds of centroids: the population centroid, a new weighted linear combination of each particle's current position, and the best individual centroid, a linear combination of each particle's best position. The proposed IPSO algorithm provides two velocity-updating forms, one of which is selected by roulette wheel for every particle at each iteration. Besides a particle's own best position and the global best position, one velocity-updating form is based on the population centroid and the other on the best individual centroid. Comparing its optimization ability with that of four other PSO algorithms, the experimental results show that the proposed IPSO algorithm reaches better optima.
The remainder of this paper is organized as follows. The basic PSO algorithm (BPSO) [18], exponential inertia weight PSO (EPSO) [23], centroid PSO (CPSO) [16], and self-adaptive comprehensive learning PSO (ACL-PSO) [17] are reviewed in Section 2. We present our improved PSO algorithm (IPSO) in Section 3. Section 4 reports our experimental results. Finally, we conclude our work in the last section.
2. Background
The basic PSO (BPSO) algorithm is a useful tool for optimization. Each particle's position stands for a candidate solution to the problem to be solved. BPSO, EPSO, CPSO, and ACL-PSO are reviewed below.
2.1. BPSO
Denote f, N, and T as the fitness function, the swarm size, and the maximum iteration number, respectively. Let x_i(t), v_i(t), and f_i(t) be the position, velocity, and fitness of the i-th particle at the t-th iteration, respectively. In addition, let pbest_i be the i-th particle's best fitness and p_i the corresponding position, s.t. pbest_i = f(p_i). Also, let gbest be the swarm's best fitness and p_g the corresponding position, s.t. gbest = f(p_g).
In BPSO, tracking the two positions p_i and p_g, each particle flies through the multidimensional search space and updates its inertia weight, velocity, and position according to (1)-(3), respectively:
w(t) = w_max - (w_max - w_min) * t / T, (1)
v_i(t+1) = w(t) v_i(t) + c_1 r_1 (p_i - x_i(t)) + c_2 r_2 (p_g - x_i(t)), (2)
x_i(t+1) = x_i(t) + v_i(t+1). (3)
In (1), w(t) stands for the inertia weight at the t-th iteration, and w_max and w_min are the initial and final inertia weights, respectively. In (2), c_1 and c_2 are acceleration parameters, and the random numbers r_1 and r_2 belong to the interval (0, 1).
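As a concrete illustration, the standard BPSO loop described by (1)-(3) can be sketched in Python as follows. This is a minimal sketch on the Sphere function; the parameter values w_max = 0.9, w_min = 0.4, and c_1 = c_2 = 2.0 are common defaults in the literature, not values prescribed by this paper.

```python
import random

def bpso(f, dim, n_particles, T, lo, hi,
         w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    """Minimal BPSO sketch: linear inertia weight (1), velocity (2), position (3)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    pf = [f(x) for x in X]                     # personal best fitness
    g = min(range(n_particles), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]                     # global best position / fitness
    for t in range(T):
        w = w_max - (w_max - w_min) * t / T    # (1) linearly decreasing weight
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))       # (2)
                X[i][d] = max(lo, min(hi, X[i][d] + V[i][d]))  # (3), clamped
            fi = f(X[i])
            if fi < pf[i]:
                P[i], pf[i] = X[i][:], fi
                if fi < gf:
                    G, gf = X[i][:], fi
    return G, gf

sphere = lambda x: sum(v * v for v in x)
best, best_f = bpso(sphere, dim=5, n_particles=20, T=200, lo=-10, hi=10)
```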
2.2. EPSO
The inertia weight, first introduced by Shi and Eberhart, is one of the important parameters in the PSO algorithm [18]. Since then, linearly and nonlinearly decreasing inertia weights have been used widely in the literature. Chauhan et al. [22] summarized the different inertia weight forms. However, there is no clear justification of how this parameter should be adjusted to improve the performance of the PSO algorithm. In [23], Ting et al. investigated the properties of an exponential inertia weight inspired by the adaptive crossover rate (ACR) used in the differential evolution algorithm. The ACR is defined as [figure omitted; refer to PDF] where [figure omitted; refer to PDF] is the initial crossover rate, [figure omitted; refer to PDF] is the current generation number, and [figure omitted; refer to PDF] is the maximum number of generations. The adaptive function for the crossover rate is crafted on the logic of a high crossover rate at the beginning of a run, to prevent premature convergence, and a low one at the end of a run, to enhance the local search. This concept carries over directly to the inertia weight in BPSO. Thus, Ting et al. defined their exponential inertia weight as follows: [figure omitted; refer to PDF] where [figure omitted; refer to PDF] is the initial inertia weight. The two parameters in (5) are named the local search attractor and the global search attractor, respectively. On the one hand, while the global search attractor is set to 1, the local search attractor has the ability to push down the value of the inertia weight; on the other hand, when the global search attractor is increased, it has the ability to pull the value back up.
Note that when [figure omitted; refer to PDF] is zero, the inertia weight becomes a static value [figure omitted; refer to PDF]; when [figure omitted; refer to PDF] is zero, it is in fact static with a value of approximately 0.32 on condition that [figure omitted; refer to PDF] is set to 1.
Substituting (5) for (1) in the BPSO algorithm, Ting et al. proposed the exponential inertia weight PSO algorithm (EPSO). After testing on 23 benchmark problems, they concluded from their simulation results that the EPSO algorithm is capable of global search and local search when [figure omitted; refer to PDF] and [figure omitted; refer to PDF], respectively. At the same time, their simulation results showed that the EPSO algorithm outperformed the BPSO algorithm, which has been used widely in many significant works.
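The exact expression of (5) is lost in this extraction. Purely as a hedged illustration, one exponential form consistent with the qualitative description above (a local attractor a pushes the weight down, a global attractor b > 0 pulls it back up, and a = 0 yields the static initial weight) would be w_0 * exp(-a*t/(b*T)); this is an assumption, not the formula from [23]:

```python
import math

def exp_inertia(t, T, w0=1.0, a=1.0, b=1.0):
    # Hypothetical exponential inertia weight consistent with the paper's
    # description: larger a (local attractor) pushes the weight down, larger
    # b (global attractor, b > 0) pulls it up, and a = 0 gives the static w0.
    return w0 * math.exp(-a * t / (b * T))
```

A quick check of the described behavior: at t = 0 the weight is w_0; doubling a lowers the weight at any fixed iteration, while doubling b raises it.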
2.3. CPSO
To enhance interparticle cooperation and information-sharing capabilities, Song et al. proposed the centroid PSO (CPSO) algorithm [16]. In the CPSO algorithm, the centroid of the particle swarm is defined as in (6), and (2) in the BPSO algorithm is replaced by (7). Consider [figure omitted; refer to PDF]
In (7) above, the additional coefficient is a positive constant similar to the two acceleration parameters in (2), and the associated random number lies between 0 and 1.
In this way, the trajectory of each particle is related not only to the individual best position and the swarm's best position but also to the centroid of the whole particle swarm. Their experimental results show that the CPSO algorithm not only enhances local search efficiency and global search performance but also effectively avoids the premature convergence problem.
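A minimal sketch of a CPSO-style update follows, assuming that the centroid of (6) is the arithmetic mean of the personal-best positions and that (7) simply adds a third attraction term with its own coefficient and random factor; the exact forms are omitted in this extraction, so both assumptions are flagged in the comments:

```python
import random

def centroid_of(pbests):
    # Centroid in the spirit of (6): assumed here to be the arithmetic mean
    # of all personal-best positions.
    n = len(pbests)
    return [sum(p[d] for p in pbests) / n for d in range(len(pbests[0]))]

def cpso_velocity(v, x, pbest, gbest, centroid,
                  w=0.7, c1=2.0, c2=2.0, c3=2.0, rng=None):
    # (7)-style update: the two usual attraction terms of (2) plus an assumed
    # third term pulling the particle toward the swarm centroid.
    rng = rng or random.Random(1)
    r1, r2, r3 = rng.random(), rng.random(), rng.random()
    return [w * vd
            + c1 * r1 * (pb - xd)
            + c2 * r2 * (gb - xd)
            + c3 * r3 * (cd - xd)
            for vd, xd, pb, gb, cd in zip(v, x, pbest, gbest, centroid)]
```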
2.4. ACL-PSO
After introducing a population-centroid learning mechanism into BPSO, Gou et al. proposed the ACL-PSO algorithm [17] based on self-adaptive comprehensive learning. They defined the fitness proportion of the i-th particle as in (8) and the population centroid at the t-th iteration as in (9). Consider [figure omitted; refer to PDF]
Considering not only each particle's own previous best position but also the evolutionary trend of the population, Gou et al. proposed two different evolution strategies for different evolution stages. Therein, [figure omitted; refer to PDF] was used to guide the particle's search direction at the early stage of evolution, while [figure omitted; refer to PDF] was used at the later stage. To realize these strategies, they used (10) to compute [figure omitted; refer to PDF] at the t-th iteration and compared it with a random number [figure omitted; refer to PDF] between 0 and 1. Consider [figure omitted; refer to PDF]
If [figure omitted; refer to PDF], the i-th particle's velocity is updated by (11) or (12); otherwise, by (2). Consider [figure omitted; refer to PDF]
In the above formulas, the inertia weight of ACL-PSO was computed by the following: [figure omitted; refer to PDF] where the random number [figure omitted; refer to PDF] lies in [figure omitted; refer to PDF]. In addition, [figure omitted; refer to PDF] was the weight coefficient of the population centroid and [figure omitted; refer to PDF] was a random number between 0 and 1. The term [figure omitted; refer to PDF] was the population-centroid learning component, which reflects an individual particle's social cognition, learning, and thinking.
Compared with four other improved PSO algorithms in terms of accuracy, convergence speed, and computational complexity, ACL-PSO converged faster and produced more robust and better optima [17].
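Since (8) and (9) are omitted in this extraction, the sketch below only illustrates the general idea of a fitness-weighted population centroid over current positions, with an assumed weighting in which particles with smaller objective values (for minimization) contribute more:

```python
def population_centroid(positions, objectives):
    # Hypothetical weighting (the exact (8)/(9) are not reproduced here):
    # a particle's weight grows as its objective value shrinks, and equal
    # objectives fall back to a uniform average.
    fmax = max(objectives)
    raw = [fmax - f for f in objectives]
    s = sum(raw)
    n, dim = len(positions), len(positions[0])
    wts = [r / s for r in raw] if s > 0 else [1.0 / n] * n
    return [sum(wts[i] * positions[i][d] for i in range(n)) for d in range(dim)]
```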
3. Our Proposed Method IPSO
Define the particle fitness [figure omitted; refer to PDF] directly as the objective function, and adjust the IPSO algorithm's search mechanism as follows: [figure omitted; refer to PDF]
Combining the EPSO algorithm's inertia weight with the population centroids of the CPSO and ACL-PSO algorithms, and taking advantage of the ACL-PSO algorithm's evolution strategies, we propose a new improved PSO algorithm (IPSO) as follows.
(1) Population Centroid and Best Individual Centroid . The centroids of the CPSO and ACL-PSO algorithms differ from each other: the centroid of the CPSO algorithm is a linear combination of each particle's best position, while that of ACL-PSO is a weighted linear combination of each particle's current position. Taking advantage of both, we embed the two centroids into BPSO to balance the algorithm's local and global search performance, and denote them the best individual centroid and the population centroid, respectively.
If [figure omitted; refer to PDF] is smaller, that is, the objective value is smaller at the t-th iteration, then the particle's current position contributes more to the construction of the population centroid at that iteration. Thus, to express the degree of importance of the particle's current position, [figure omitted; refer to PDF], in the population centroid, we define the i-th particle's proportion as in (15) and compute the population centroid [figure omitted; refer to PDF] by (9). At the same time, (16) follows naturally. Consider [figure omitted; refer to PDF]
Gou et al. [17] proved that their population centroid is the convergence position when the ACL-PSO algorithm converges to a local minimum or converges globally. Similarly, letting [figure omitted; refer to PDF] satisfy (17), (18) indicates that [figure omitted; refer to PDF] is the convergence position of the IPSO algorithm when the algorithm converges to a local minimum or converges globally. Consider [figure omitted; refer to PDF]
(2) Inertia Weight . Based on the exponential inertia weight proposed by Ting et al. [23], we select two pairs of [figure omitted; refer to PDF] and [figure omitted; refer to PDF]. One pair sets [figure omitted; refer to PDF] to 1 and [figure omitted; refer to PDF] to 2; the other swaps these values. Denote them as in (19) at the t-th iteration. These inertia weights work with different evolution strategies, as in the ACL-PSO algorithm. Consider [figure omitted; refer to PDF]
(3) Update Velocity . It can be drawn from (18) that the population centroid coincides with the population's globally optimal location. Hence, we use Gou et al.'s method [17] to help the i-th particle select its velocity-updating formula. If [figure omitted; refer to PDF], update the i-th particle's velocity by (12) with [figure omitted; refer to PDF]; otherwise, by (7) with [figure omitted; refer to PDF]. The execution process of the IPSO algorithm is shown in Algorithm 1.
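Equation (10) is omitted in this extraction; as an assumed illustration only, the roulette-style strategy selection can be sketched with a linearly decaying switching probability, so that early iterations favor the population-centroid form (exploration) and later ones the best-individual-centroid form (refinement):

```python
import random

def choose_strategy(t, T, rng):
    # Assumed schedule standing in for (10): the probability of picking the
    # population-centroid velocity form decays linearly from 1 to 0 over the
    # run, so late iterations favor the best-individual-centroid form.
    p_t = 1.0 - t / T
    return "population_centroid" if rng.random() < p_t else "best_individual_centroid"
```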
Algorithm 1: The execution process of IPSO algorithm.
Algorithm. IPSO ( [figure omitted; refer to PDF] )
Input: [figure omitted; refer to PDF]
Output: [figure omitted; refer to PDF] , [figure omitted; refer to PDF]
Begin
(1) Initialize the parameters including population size [figure omitted; refer to PDF] , the dimension size [figure omitted; refer to PDF] ,
the maximum iteration number [figure omitted; refer to PDF] , and the current iterative count [figure omitted; refer to PDF]
(2) Generate the initial population and initial velocity. The initial population and initial velocity for each
particle are randomly generated as follows:
Population = [figure omitted; refer to PDF] [figure omitted; refer to PDF]
Velocity = [figure omitted; refer to PDF] [figure omitted; refer to PDF]
where [figure omitted; refer to PDF] and [figure omitted; refer to PDF] are the position and velocity at the [figure omitted; refer to PDF] th iteration for the [figure omitted; refer to PDF] th particle,
respectively. [figure omitted; refer to PDF] is the dimension of each particle. [figure omitted; refer to PDF] , [figure omitted; refer to PDF] are the minimum and maximum value of
each point belonging to the [figure omitted; refer to PDF] th dimension, respectively. [figure omitted; refer to PDF] , [figure omitted; refer to PDF] are the minimum and maximum
value of each point belonging to the [figure omitted; refer to PDF] th dimension, respectively.
(3) Calculate the fitness value of each particle and record it as [figure omitted; refer to PDF]; record the i-th particle's
best fitness and its corresponding position [figure omitted; refer to PDF] as [figure omitted; refer to PDF], [figure omitted; refer to PDF], s.t. [figure omitted; refer to PDF], [figure omitted; refer to PDF]
(4) Calculate the population's best fitness [figure omitted; refer to PDF] and its corresponding position [figure omitted; refer to PDF], s.t. [figure omitted; refer to PDF]
(5) Compute population centroid by (9) with the weighted coefficient [figure omitted; refer to PDF] computed by (15),
compute best individual centroid by (6).
(6) For [figure omitted; refer to PDF] to [figure omitted; refer to PDF] do
(6.1) Compute [figure omitted; refer to PDF] by (10), and generate a random number [figure omitted; refer to PDF] ;
(6.2) If [figure omitted; refer to PDF]
(6.2.1) Update [figure omitted; refer to PDF] by (19)
(6.2.2) For [figure omitted; refer to PDF] to [figure omitted; refer to PDF] do
Update [figure omitted; refer to PDF] and [figure omitted; refer to PDF] by (12) and (3) respectively, compute [figure omitted; refer to PDF]
If [figure omitted; refer to PDF] then [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] ,
End If
End
Else
(6.2.3) Update [figure omitted; refer to PDF] by (19)
(6.2.4) For [figure omitted; refer to PDF] to [figure omitted; refer to PDF] do
Update [figure omitted; refer to PDF] and [figure omitted; refer to PDF] by (7) and (3) respectively, compute [figure omitted; refer to PDF]
If [figure omitted; refer to PDF] then [figure omitted; refer to PDF] , [figure omitted; refer to PDF] , [figure omitted; refer to PDF] ,
End If
End
End If
(6.3) Compute [figure omitted; refer to PDF] and set [figure omitted; refer to PDF] , where [figure omitted; refer to PDF]
(6.4) Compute population centroid by (9) with the weighted coefficient [figure omitted; refer to PDF] computed by (15),
compute best individual centroid by (6).
(6.5) Let [figure omitted; refer to PDF]
End
(7) Output [figure omitted; refer to PDF], [figure omitted; refer to PDF]
End
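Putting the pieces together, Algorithm 1 can be rendered as a runnable Python skeleton. Every equation whose exact form is omitted in this extraction, namely (9), (10), (15), and (19), is replaced by a clearly labeled assumption in the comments; this is a sketch of the overall control flow under those assumptions, not a faithful reproduction of the authors' implementation:

```python
import math
import random

def ipso(f, dim, n, T, lo, hi, w0=1.0, seed=0):
    """Runnable skeleton of Algorithm 1 (hedged: see assumption comments)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]
    pf = [f(x) for x in X]
    g = min(range(n), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]
    c = 2.0
    for t in range(T):
        # Two exponential weights standing in for (19); the (a, b) pairs
        # (1, 2) and (2, 1) follow the text, the formula itself is assumed.
        w_glob = w0 * math.exp(-1 * t / (2 * T))   # slow decay: global search
        w_loc = w0 * math.exp(-2 * t / (1 * T))    # fast decay: local search
        # Population centroid for (9): assumed weights favor small objectives.
        fs = [f(x) for x in X]
        fmax = max(fs)
        raw = [fmax - v for v in fs]
        s = sum(raw) or 1.0
        pop_c = [sum((raw[i] / s) * X[i][d] for i in range(n)) for d in range(dim)]
        # Best-individual centroid for (6): mean of personal-best positions.
        best_c = [sum(p[d] for p in P) / n for d in range(dim)]
        p_t = 1.0 - t / T                          # assumed schedule for (10)
        for i in range(n):
            use_pop = rng.random() < p_t
            w, cen = (w_glob, pop_c) if use_pop else (w_loc, best_c)
            for d in range(dim):
                r1, r2, r3 = rng.random(), rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c * r1 * (P[i][d] - X[i][d])
                           + c * r2 * (G[d] - X[i][d])
                           + c * r3 * (cen[d] - X[i][d]))
                X[i][d] = max(lo, min(hi, X[i][d] + V[i][d]))
            fi = f(X[i])
            if fi < pf[i]:
                P[i], pf[i] = X[i][:], fi
                if fi < gf:
                    G, gf = X[i][:], fi
    return G, gf

sphere = lambda x: sum(v * v for v in x)
pos, val = ipso(sphere, dim=5, n=20, T=200, lo=-10, hi=10)
```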
4. Experiments and Results
In this section, the BPSO, EPSO, CPSO, and ACL-PSO algorithms are compared with the IPSO algorithm on four benchmark functions to verify its feasibility. Descriptions of the test functions, which can be divided into unimodal and multimodal functions, are given in Table 1. Using the objective functions in Table 1 to evaluate each particle's fitness, the smaller the function value, the higher the fitness.
Table 1: Descriptions of four benchmark functions.
Function | Mathematical formula | Range and dim. | Minima | Characteristics |
Griewank | f(x) = 1 + (1/4000) sum_{i=1..n} x_i^2 - prod_{i=1..n} cos(x_i / sqrt(i)) | [figure omitted; refer to PDF] | 0 | Highly multimodal |
Rastrigin | f(x) = sum_{i=1..n} [x_i^2 - 10 cos(2 pi x_i) + 10] | [figure omitted; refer to PDF] | 0 | Highly multimodal |
Rosenbrock | f(x) = sum_{i=1..n-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2] | [figure omitted; refer to PDF] | 0 | Multiple local optima |
Sphere | f(x) = sum_{i=1..n} x_i^2 | [figure omitted; refer to PDF] | 0 | Unimodal convex |
The experiments proceed as follows. First, determine the parameter pairs of the IPSO algorithm, such as [figure omitted; refer to PDF] and [figure omitted; refer to PDF]. Second, fixing the number of iterations and varying the number of particles, evaluate the performance of the five algorithms by the average objective value at the t-th iteration. Finally, setting the maximum iteration number and different target accuracies for these functions, compare the success rate and the average convergence iteration number.
The average objective value at the t-th iteration over all [figure omitted; refer to PDF] turns is given by (20) for each algorithm, where [figure omitted; refer to PDF] stands for the global best fitness of the given algorithm at the t-th iteration of the r-th turn. Consider [figure omitted; refer to PDF]
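A small helper reflecting (20): for each iteration, average the recorded global best objective values over all independent turns.

```python
def average_curve(runs):
    # runs[r][t] = global best objective value at iteration t of run r;
    # (20) averages over all R independent turns at each iteration.
    R, T = len(runs), len(runs[0])
    return [sum(runs[r][t] for r in range(R)) / R for t in range(T)]
```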
(1) Determine the Parameter Pairs of [figure omitted; refer to PDF] and [figure omitted; refer to PDF] in the IPSO Algorithm . With [figure omitted; refer to PDF], let [figure omitted; refer to PDF] be 10, 20, and 30 in turn, and let [figure omitted; refer to PDF] be updated by (21) for each [figure omitted; refer to PDF]. Run the IPSO algorithm [figure omitted; refer to PDF] turns each time. The resulting curves for the four benchmark functions are shown in Figures 1, 2, 3, and 4. Consider [figure omitted; refer to PDF]
Figure 1: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to different [figure omitted; refer to PDF] (panels (a)-(c) omitted; refer to PDF).
Figure 2: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to different [figure omitted; refer to PDF] (panels (a)-(c) omitted; refer to PDF).
Figure 3: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to different [figure omitted; refer to PDF]; (a2) is a partial graph of (a1) corresponding to [figure omitted; refer to PDF]; (b2) is a partial graph of (b1) corresponding to [figure omitted; refer to PDF]; (c2) is a partial graph of (c1) corresponding to [figure omitted; refer to PDF] (panels omitted; refer to PDF).
Figure 4: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to different [figure omitted; refer to PDF] (panels (a)-(c) omitted; refer to PDF).
From Figures 1-4, we can see that the number of iterations, the value of parameter [figure omitted; refer to PDF], and the swarm size [figure omitted; refer to PDF] each affect the performance of the IPSO algorithm. Setting [figure omitted; refer to PDF] to 0.5 and [figure omitted; refer to PDF] to 30, [figure omitted; refer to PDF] reaches its minimum at the later iteration stage for the four benchmark functions. If [figure omitted; refer to PDF] is zero, [figure omitted; refer to PDF] tends to infinity, and the corresponding curves of [figure omitted; refer to PDF] do not appear. This phenomenon occurs in the curves of functions [figure omitted; refer to PDF] and [figure omitted; refer to PDF]; that is, the IPSO algorithm can find the target optimal value for [figure omitted; refer to PDF] and [figure omitted; refer to PDF] within a small number of iterations, as can be seen in Figures 1 and 2. When [figure omitted; refer to PDF] is smaller, [figure omitted; refer to PDF] becomes smaller too. From Figure 3, we can see that setting [figure omitted; refer to PDF] is more effective for searching the optimal solution of [figure omitted; refer to PDF] with the IPSO algorithm at the later iteration stage. As for [figure omitted; refer to PDF], setting [figure omitted; refer to PDF] is reasonable.
With different pairs of swarm size and [figure omitted; refer to PDF], [figure omitted; refer to PDF] reaches its minimum at the later iterations. Table 2 lists the disjoint distribution of [figure omitted; refer to PDF] pairs according to [figure omitted; refer to PDF]. Larger population sizes require more function evaluations and increase the computational effort for convergence but improve the reliability of the algorithm; the problem is to find a compromise between cost and reliability. To give the IPSO algorithm better optimization capability, we set the swarm size [figure omitted; refer to PDF] and let parameter [figure omitted; refer to PDF] take the corresponding values in the third row of Table 2 for the different benchmark functions in the following tests.
Table 2: Disjoint distribution of [figure omitted; refer to PDF] for each function.
Function | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] | [figure omitted; refer to PDF] | |
[figure omitted; refer to PDF] | 10 | 0.5 | 1 | 2 | 0.5 |
20 | 0.5 | 0.5 | 2 | 0.5 | |
30 | 0.5 | 0.5 | 2 | 0.5 |
(2) Compare the IPSO Algorithm with the Other Four Algorithms . Using the BPSO, EPSO, CPSO, and ACL-PSO algorithms for comparison with the IPSO algorithm on the four benchmark functions above, we set [figure omitted; refer to PDF]; the other parameters of the compared algorithms are listed in Table 3.
Table 3: Parameter settings of different algorithms.
Algorithms | Parameters |
BPSO [18] | [figure omitted; refer to PDF] |
EPSO [23] | [figure omitted; refer to PDF] |
CPSO [16] | [figure omitted; refer to PDF] |
ACL-PSO [17] | [figure omitted; refer to PDF] |
IPSO | [figure omitted; refer to PDF] |
Each of the five algorithms was run 50 times independently. Five indicators were recorded: the minimum objective value (Min), the maximum objective value (Max), the mean objective value (Mean), the deviation of the objective value (Dev), and the average convergence iteration number (ACIN). Our experimental results are shown in Table 4 and Figures 5, 6, 7, and 8.
Table 4: Comparison of the results of different PSO algorithms ( [figure omitted; refer to PDF] = 20). For each of the four benchmark functions, the Min, Mean, Max, Deviation, and ACIN indicators of the BPSO, EPSO, CPSO, ACL-PSO, and IPSO algorithms are reported. (All numeric entries were lost in extraction; refer to PDF.)
Figure 5: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to the results of the five algorithms; (b) is a partial graph of (a) corresponding to [figure omitted; refer to PDF] (panels omitted; refer to PDF).
Figure 6: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to the results of the five algorithms; (b) is a partial graph of (a) corresponding to [figure omitted; refer to PDF] (panels omitted; refer to PDF).
Figure 7: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to the results of the five algorithms; (b) is a partial graph of (a) corresponding to [figure omitted; refer to PDF] (panels omitted; refer to PDF).
Figure 8: [figure omitted; refer to PDF] curves of [figure omitted; refer to PDF] according to the results of the five algorithms; (b) is a partial graph of (a) corresponding to [figure omitted; refer to PDF] (panels omitted; refer to PDF).
From Table 4, it can be seen that the IPSO achieves higher accuracy than the BPSO, EPSO, CPSO, and ACL-PSO. From the mean and deviation in Table 4, the IPSO outperforms the BPSO, EPSO, CPSO, and ACL-PSO, with steady convergence. In addition, the IPSO algorithm needs fewer iterations to converge than the ACL-PSO algorithm does.
Taking the iteration number t as the x-coordinate and the average best objective value at the t-th iteration as the y-coordinate, we plot the [figure omitted; refer to PDF] curves of these four benchmark functions in Figures 5-8.
From Figures 5-8, it can be seen that the IPSO algorithm, compared with the other four algorithms, finds better objective values at later iterations. The EPSO, CPSO, ACL-PSO, and IPSO algorithms can all find better solutions than the BPSO algorithm within a few iterations, and the performance of the IPSO algorithm is the best among the five. In addition, the IPSO algorithm needs fewer iterations to converge than the ACL-PSO algorithm does. We attribute this to the combination of the inertia weight [figure omitted; refer to PDF] working with the population centroid and [figure omitted; refer to PDF] working with the best individual centroid.
(3) Comparison of Success Rate and Average Number of Iterations for Reaching the Target Precision. To further validate the effectiveness of the IPSO algorithm, we set target precisions for the four benchmark functions above (listed in Table 5) and run each algorithm 100 times per test function. With the maximum number of iterations set to 1000, whenever the objective value becomes less than or equal to the target precision, we increment the success count by one and record the current iteration number for that run. The success rate (SR) is then the success count divided by the total number of runs, and the average convergence iteration (ACI) is the mean iteration number over all runs. Four groups of target precisions, listed in Table 5, are used to evaluate the stability of the algorithms. The experimental results are shown in Table 5.
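The SR and ACI computation described above can be sketched as follows. The function name `sr_and_aci` and the input convention (the iteration at which each run first reached the target, or `None` if it never did) are our own assumptions; counting failed runs as the maximum iteration number is consistent with the ACI of 1000 reported for 0% success rates in Table 5.

```python
def sr_and_aci(first_hit_iters, max_iter=1000):
    """Success rate (SR, %) and average convergence iteration (ACI).

    first_hit_iters: one entry per run -- the iteration at which the
    run first reached the target precision, or None if it never did
    within max_iter iterations.
    """
    runs = len(first_hit_iters)
    hits = [it for it in first_hit_iters if it is not None]
    sr = 100.0 * len(hits) / runs
    # Failed runs contribute max_iter, so SR = 0 yields ACI = max_iter.
    aci = (sum(hits) + (runs - len(hits)) * max_iter) / runs
    return sr, aci

# 100 hypothetical runs: 86 succeed at iteration 900, 14 never succeed.
sr, aci = sr_and_aci([900] * 86 + [None] * 14)
```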
Table 5: Comparison of experimental results of success rate and average convergence iteration.
Function | Target precision | Indicator | BPSO | EPSO | CPSO | ACL-PSO | IPSO
[figure omitted; refer to PDF] | 0.1 | SR (%) | 0 | 0 | 0 | 86 | 100
 | | ACI | 1000 | 1000 | 1000 | 901.28 | 676.66
 | 0.6 | SR (%) | 0 | 4 | 0 | 98 | 100
 | | ACI | 1000 | 974.80 | 1000 | 747.80 | 607.82
 | 1 | SR (%) | 0 | 26 | 4 | 100 | 100
 | | ACI | 1000 | 818.98 | 989.02 | 439.76 | 472.48
 | 1.5 | SR (%) | 16 | 100 | 64 | 100 | 100
 | | ACI | 991.82 | 327 | 840.20 | 23.10 | 269.64
[figure omitted; refer to PDF] | 10^-7 | SR (%) | 0 | 0 | 0 | 48 | 100
 | | ACI | 1000 | 1000 | 1000 | 967.60 | 369.50
 | 20 | SR (%) | 0 | 4 | 0 | 100 | 100
 | | ACI | 1000 | 971.98 | 1000 | 40.92 | 262.20
 | 30 | SR (%) | 0 | 8 | 2 | 100 | 100
 | | ACI | 1000 | 945.64 | 998.82 | 42.42 | 267.02
 | 40 | SR (%) | 2 | 30 | 12 | 100 | 100
 | | ACI | 982.44 | 775.36 | 944.36 | 28.62 | 254.82
[figure omitted; refer to PDF] | 19 | SR (%) | 0 | 0 | 0 | 70 | 100
 | | ACI | 1000 | 1000 | 1000 | 932.98 | 336.88
 | 30 | SR (%) | 0 | 8 | 0 | 100 | 100
 | | ACI | 1000 | 947.38 | 1000 | 444.52 | 275.40
 | 40 | SR (%) | 0 | 14 | 2 | 100 | 100
 | | ACI | 1000 | 903.22 | 983 | 333.14 | 608.60
 | 500 | SR (%) | 6 | 62 | 30 | 100 | 100
 | | ACI | 998.72 | 530.34 | 943.50 | 36.18 | 28.40
[figure omitted; refer to PDF] | 10^-10 | SR (%) | 0 | 0 | 0 | 78 | 100
 | | ACI | 1000 | 1000 | 1000 | 920.22 | 330.84
 | 0.01 | SR (%) | 0 | 10 | 8 | 100 | 100
 | | ACI | 1000 | 932.46 | 980.92 | 229.54 | 255.48
 | 0.05 | SR (%) | 4 | 68 | 18 | 100 | 100
 | | ACI | 998.68 | 504.24 | 959.46 | 53.96 | 225.48
 | 0.1 | SR (%) | 14 | 90 | 52 | 100 | 100
 | | ACI | 994 | 294.80 | 899.40 | 34.64 | 209.18
From Table 5, it can be seen that the success rates of the five algorithms are affected by the target precision. The success rate of the IPSO algorithm reaches 100% on every test function and is clearly better than those of the BPSO, EPSO, CPSO, and ACL-PSO. The average convergence iteration of the EPSO is smaller than that of the CPSO algorithm, and the ACI of the IPSO is smaller than that of the EPSO algorithm. In terms of the ability to find optimal solutions, the ACL-PSO algorithm outperforms the BPSO, EPSO, and CPSO algorithms, and the IPSO algorithm outperforms the ACL-PSO algorithm.
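The exponential inertia weight underlying the EPSO and IPSO algorithms is defined earlier in the paper; as a generic illustration only (the function name, default parameter values, and the decay constant `k` below are our own, in the spirit of the exponential strategies of [19, 23], not the paper's exact equation), such a weight can be written as w(t) = w_end + (w_start - w_end) e^(-k t / T):

```python
import math

def exp_inertia_weight(t, T, w_start=0.9, w_end=0.4, k=4.0):
    """Exponentially decreasing inertia weight.

    Large early values favor global exploration; small late values
    favor local exploitation. k controls how fast the weight decays
    from w_start toward w_end over T iterations.
    """
    return w_end + (w_start - w_end) * math.exp(-k * t / T)
```

With such a schedule, the weight starts at w_start and decays smoothly toward (but never below) w_end, which is consistent with the observation that a large inertia weight facilitates a global search while a small one facilitates a local search.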
5. Conclusions
The particle swarm optimization algorithm is a global stochastic tool with the ability to search for the global optimum, but it is easily trapped in local optima. In this paper, to overcome this shortcoming of the PSO algorithm, we propose an improved particle swarm optimization algorithm (IPSO) based on two forms of exponential inertia weight and two kinds of centroids. Comparing the optimization ability of the IPSO algorithm with the BPSO, EPSO, CPSO, and ACL-PSO algorithms on the four benchmark functions, the experimental results show that the proposed IPSO algorithm is more efficient and outperforms the other PSO algorithms investigated in this paper in accuracy. The inertia weight is one of the important parameters of the PSO algorithm. How can (5) respect the constraint on [figure omitted; refer to PDF]? Moreover, can [figure omitted; refer to PDF] and [figure omitted; refer to PDF] be chosen adaptively throughout a single run to guarantee a suitable trade-off between the exploration and exploitation phases? These are promising directions for our future research.
Acknowledgments
This work was supported by the Natural Science Foundation of Jiangsu Province of China under Grants No. BK20140857 and No. BK20141420 and by the Excellent Young Talents Fund of Anhui Province of China (Grant No. 2012SQRL154).
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
[1] J. Kennedy, R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, Perth, Australia, December 1995.
[2] J. J. Liang, A. K. Qin, P. N. Suganthan, S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation , vol. 10, no. 3, pp. 281-295, 2006.
[3] S. Ding, H. Jiang, J. Li, G. Tang, "Optimization of well placement by combination of a modified particle swarm optimization algorithm and quality map method," Computational Geosciences , vol. 18, no. 5, pp. 747-762, 2014.
[4] A. Khare, S. Rangnekar, "A review of particle swarm optimization and its applications in Solar Photovoltaic system," Applied Soft Computing Journal , vol. 13, no. 5, pp. 2997-3006, 2013.
[5] S. Alam, G. Dobbie, Y. S. Koh, P. Riddle, S. Ur Rehman, "Research on particle swarm optimization based clustering: a systematic review of literature and techniques," Swarm and Evolutionary Computation , vol. 17, pp. 1-13, 2014.
[6] M. Clerc, J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation , vol. 6, no. 1, pp. 58-73, 2002.
[7] S. Janson, M. Middendorf, "A hierarchical particle swarm optimizer and its adaptive variant," IEEE Transactions on Systems, Man, and Cybernetics Part B: Cybernetics , vol. 35, no. 6, pp. 1272-1282, 2005.
[8] M. S. Kiran, M. Gündüz, Ö. K. Baykan, "A novel hybrid algorithm based on particle swarm and ant colony optimization for finding the global minimum," Applied Mathematics and Computation , vol. 219, no. 4, pp. 1515-1521, 2012.
[9] X. H. Shi, Y. C. Liang, H. P. Lee, C. Lu, L. M. Wang, "An improved GA and a novel PSO-GA-based hybrid algorithm," Information Processing Letters , vol. 93, no. 5, pp. 255-261, 2005.
[10] J. Robinson, S. Sinton, Y. R. Samii, "Particle swarm, genetic algorithm, and their hybrids: optimization of a profiled corrugated horn antenna," in Proceedings of the IEEE International Symposium in Antennas and Propagation Society, vol. 1, pp. 314-317, San Antonio, Tex, USA, 2002.
[11] Q. Shen, W.-M. Shi, W. Kong, "Hybrid particle swarm optimization and tabu search approach for selecting genes for tumor classification using gene expression data," Computational Biology and Chemistry , vol. 32, no. 1, pp. 52-59, 2008.
[12] H.-L. Shieh, C.-C. Kuo, C.-M. Chiang, "Modified particle swarm optimization algorithm with simulated annealing behavior and its numerical verification," Applied Mathematics and Computation , vol. 218, no. 8, pp. 4365-4383, 2011.
[13] B. Alatas, E. Akin, A. B. Ozer, "Chaos embedded particle swarm optimization algorithms," Chaos, Solitons and Fractals , vol. 40, no. 4, pp. 1715-1734, 2009.
[14] C.-H. Yang, S.-W. Tsai, L.-Y. Chuang, C.-H. Yang, "An improved particle swarm optimization with double-bottom chaotic maps for numerical optimization," Applied Mathematics and Computation , vol. 219, no. 1, pp. 260-279, 2012.
[15] C. Li, J. Zhou, P. Kou, J. Xiao, "A novel chaotic particle swarm optimization based fuzzy clustering algorithm," Neurocomputing , vol. 83, pp. 98-109, 2012.
[16] S. L. Song, L. Kong, J. J. Cheng, "A novel particle swarm optimization algorithm model with centroid and its application," International Journal of Intelligent Systems and Applications , vol. 1, no. 1, pp. 42-49, 2009.
[17] J. Gou, Z. Y. Wu, J. Wang, "An improved particle swarm optimization algorithm based on self-adapted comprehensive learning," Advanced Science Letters , vol. 11, no. 1, pp. 668-675, 2012.
[18] Y. Shi, R. C. Eberhart, "Empirical study of particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '99), vol. 3, pp. 1945-1950, Washington, DC, USA, July 1999.
[19] C. Guimin, H. Xinbo, J. Jianyuan, M. Zhengfeng, "Natural exponential inertia weight strategy in particle swarm optimization," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 3672-3675, June 2006.
[20] J. C. Bansal, P. K. Singh, M. Saraswat, A. Verma, S. S. Jadon, A. Abraham, "Inertia weight strategies in particle swarm optimization," in Proceedings of the 3rd World Congress on Nature and Biologically Inspired Computing (NaBIC '11), pp. 633-640, Salamanca, Spain, October 2011.
[21] A. Nickabadi, M. M. Ebadzadeh, R. Safabakhsh, "A novel particle swarm optimization algorithm with adaptive inertia weight," Applied Soft Computing Journal , vol. 11, no. 4, pp. 3658-3670, 2011.
[22] P. Chauhan, K. Deep, M. Pant, "Novel inertia weight strategies for particle swarm optimization," Memetic Computing , vol. 5, no. 3, pp. 229-251, 2013.
[23] T. O. Ting, Y. H. Shi, S. Cheng, S. Lee, "Exponential inertia weight for particle swarm optimization," in Proceedings of the 3rd International Conference on Advances in Swarm Intelligence, vol. 1, pp. 83-90, June 2012.
Copyright © 2014 Shouwen Chen et al.
Abstract
The particle swarm optimization (PSO) algorithm is a global stochastic tool with the ability to search for the global optimum. However, the PSO algorithm is easily trapped in local optima and converges with low accuracy. In this paper, to overcome this shortcoming of the PSO algorithm, an improved particle swarm optimization algorithm (IPSO), based on two forms of exponential inertia weight and two types of centroids, is proposed. Comparing the optimization ability of the IPSO algorithm with the BPSO, EPSO, CPSO, and ACL-PSO algorithms, the experimental results show that the proposed IPSO algorithm is more efficient and outperforms the four baseline PSO algorithms in accuracy.