Abstract
Because of the No Free Lunch (NFL) theorem, we are still on the way to developing new algorithms and improving the capabilities of existing ones. Considering the simple and steady convergence of the sine cosine algorithm (SCA) and the fast convergence rate of the Harris Hawk optimization (HHO) algorithm, we propose a new hybrid of the SCA and HHO in this paper, henceforth called the CSCAHHO algorithm. An energy parameter is introduced to balance the exploration and exploitation procedures of individuals in the new swarm, and chaos is introduced to improve randomness. The updating equations are redefined by combining the equations of the SCA and HHO algorithms. Simulation experiments on 27 benchmark functions and the CEC 2014 competition functions, together with 3 engineering problems, are carried out. Comparisons are made with the original SCA, HHO, Arithmetic optimization algorithm (AOA), Seagull optimization algorithm (SOA), Sooty Tern optimization algorithm (STOA), Aquila optimizer (AO) and Chimp optimization algorithm (ChOA). Simulation experiments on unimodal and multimodal benchmark functions, the CEC 2014 functions, and real engineering problems all verify the better performance of the proposed CSCAHHO, namely a faster convergence rate, lower residual errors, and steadier behavior. Matlab code of this algorithm is shared on Gitee at: https://gitee.com/yuj-zhang/cscahho.
Citation: Zhang Y-J, Yan Y-X, Zhao J, Gao Z-M (2022) CSCAHHO: Chaotic hybridization algorithm of the Sine Cosine with Harris Hawk optimization algorithms for solving global optimization problems. PLoS ONE 17(5): e0263387. https://doi.org/10.1371/journal.pone.0263387
Editor: Seyedali Mirjalili, Torrens University Australia, AUSTRALIA
Received: November 22, 2021; Accepted: January 19, 2022; Published: May 19, 2022
Copyright: © 2022 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: Matlab code of this algorithm is shared on Gitee at the following address: (https://gitee.com/yuj-zhang/cscahho).
Funding: [1] The scientific research team project of Jingchu University of Technology, grant number TD202001. [2] National Training Program of Innovation and Entrepreneurship for Undergraduates, grant number 202111336006. [3] The key research and development project of Jingmen, grant number 2019YFZD009. [4] Provincial teaching reform research project of Hubei universities, grant number 2020683.
Competing interests: The authors have declared that no competing interests exist.
1 Introduction
Thanks to the development of science and technology, we humans now face an increasingly complicated and detailed world. More and more of the functions we formulate to describe real-world problems are difficult to solve. To obtain convincing solutions, we have developed a class of algorithms, called nature-inspired algorithms [1], that search for the best solutions through randomized search. However, by the famous no free lunch (NFL) theorem [2], there is always a trade-off between capability and complexity, and we are all busy finding the best way to solve problems with ease. Many algorithms have been proposed by now; most of them can be classified into four types: evolutionary, human-based, physics-based, and swarm-based algorithms [3], as shown in Table 1.
[Figure omitted. See PDF.]
Although many algorithms have been proposed in the literature, many problems [49] remain difficult to solve [50]. Therefore, we are still on the way to finding more capable algorithms, or even improvements that increase capability only slightly.
Considering the simplicity and good performance of the sine cosine algorithm (SCA) [51] and the fast convergence of the Harris Hawk optimization (HHO) [52] algorithm, we propose a hybrid algorithm with chaos, called the chaotic hybridization of the SCA and HHO algorithms (CSCAHHO), in this paper. The new algorithm embraces the advantages of the two algorithms and promises better performance.
The rest of this paper is organized as follows: Section 2 reviews the two algorithms and discusses their defects. Section 3 proposes the CSCAHHO algorithm, and Section 4 reports the simulation experiments. Engineering design problems are solved in Section 5, and discussions and conclusions are presented in Section 6.
2 Preliminaries
2.1 The SCA and its defects
The SCA is very simple: individuals in the SCA swarm randomly choose one of two equations to update their positions. The equations are based on the sine and cosine functions and are formulated as follows:

$$X(t+1) = X(t) + r_1 \times \sin(r_2) \times \left| r_3 P_b(t) - X(t) \right| \tag{1}$$

$$X(t+1) = X(t) + r_1 \times \cos(r_2) \times \left| r_3 P_b(t) - X(t) \right| \tag{2}$$

$$r_1 = a - t \times \frac{a}{maxIter} \tag{3}$$

where X(t) and X(t+1) represent the positions of an individual at iterations t and t+1, r_2 is a random number in the interval [0, 2π], and r_3 is a random weight in the interval [0, 2]. P_b(t) represents the best position at the current iteration t, a is a constant, and maxIter represents the maximum allowed number of iterations.
Another random parameter, r_7, balances the two equations: if r_7 ≥ 0.5, Eq (1) is chosen to update the positions of individuals; otherwise, Eq (2) is followed.
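For illustration, the following is a minimal Python/NumPy sketch of this update rule, rendering Eqs (1)-(3); it is not the authors' released Matlab code, and the function and variable names are our own:

```python
import numpy as np

def sca_update(X, P_best, t, max_iter, a=2.0):
    """One SCA position update for a swarm X of shape (N, D).

    P_best is the best position found so far; Eq (3) decays r1
    linearly from a down to 0 over the run.
    """
    N, D = X.shape
    r1 = a - t * (a / max_iter)                 # Eq (3)
    r2 = np.random.uniform(0, 2 * np.pi, (N, D))
    r3 = np.random.uniform(0, 2, (N, D))
    r7 = np.random.rand(N, 1)                   # switch between Eq (1) and Eq (2)
    dist = np.abs(r3 * P_best - X)
    sine_step = X + r1 * np.sin(r2) * dist      # Eq (1)
    cosine_step = X + r1 * np.cos(r2) * dist    # Eq (2)
    return np.where(r7 >= 0.5, sine_step, cosine_step)
```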
Apparently, the SCA is very easy to understand and to code in applications. It has been very popular since its birth and has been applied in many fields of engineering. Li et al. [53] applied the SCA to time series forecasting problems. Reddy et al. [54] solved the unit commitment optimization problem in the electricity market with the SCA. Nayak et al. [55] built a medical diagnosis system based on the SCA.
However, the original SCA is so simple that individuals can only follow their current positions and the sine- or cosine-weighted distance between themselves and the global best candidate, which results in poor global exploration and exploitation performance. To increase its capability, Chen et al. introduced orthogonal learning, multi-swarm, and greedy selection mechanisms into the SCA; the improved variant is called OMGSCA [56]. Wang et al. proposed an SCA variant into which several strategies, such as the Cauchy mutation operator, chaotic local search, and opposition-based learning, are introduced [57]. Zhu et al. introduced orthogonal learning into the SCA [58], which consequently enhanced its basic optimization capability.
2.2 The HHO algorithm and its defects
The HHO algorithm was proposed in 2019 [52]. It might be the first nature-inspired algorithm to introduce multiple updating disciplines [59] for individuals in the swarm to update their positions. Four special conditions are considered and four types of updates are involved; individuals in the HHO swarm choose one of them based on the escaping energy E and randomness.
a) Exploration procedure.
When the escaping energy E of the rabbit is larger than 1 or smaller than -1, individuals in the HHO swarm explore the whole area quickly. Two strategies are adopted to do so, formulated as follows:

$$X(t+1) = \begin{cases} X_{rand}(t) - r_1 \left| X_{rand}(t) - 2 r_2 X(t) \right|, & q \ge 0.5 \\ \left( P_b(t) - X_m(t) \right) - r_3 \left( LB + r_4 (UB - LB) \right), & q < 0.5 \end{cases} \tag{4}$$

where X_rand(t) is a randomly selected candidate at the current iteration, and X_m(t) represents the average position of all individuals at the current iteration, calculated with Eq (5). r_1, r_2, r_3, r_4 and q are random numbers in the interval [0, 1], and [LB, UB] is the domain of the given problem.

$$X_m(t) = \frac{1}{N} \sum_{i=1}^{N} X_i(t) \tag{5}$$
b) Exploitation procedure.
When a rabbit is found as prey, individuals in the HHO swarm perform the exploitation procedure according to the status of the rabbit, with smart actions. This behavior is controlled by the escaping energy of the rabbit:

$$E = 2 E_0 \left( 1 - \frac{t}{maxIter} \right) \tag{6}$$

where E_0 is the initial energy of the rabbit, a random value in (-1, 1). When |E| ≥ 1, individuals in the HHO swarm perform exploration; on the contrary, when |E| < 1, individuals perform exploitation around the domain and select one of four ways to update their positions, based on the real-time escaping energy value and a random number r (a code sketch of this selection follows item iv below).
i) Soft besiege. When r ≥ 0.5 and |E| ≥ 0.5, individuals in the HHO swarm are aware that the rabbit is still strong and would run fast to escape; therefore, they fly around the prey and attack it when possible. This attack can be expressed by the following formula:

$$X(t+1) = \Delta X(t) - E \left| J P_b(t) - X(t) \right|, \quad \Delta X(t) = P_b(t) - X(t) \tag{7}$$

where J represents the random jump strength of the prey and is formulated as:

$$J = 2 (1 - r_5) \tag{8}$$

where r_5 is another random number in (0, 1).
ii) Soft besiege with progressive rapid dives. When r < 0.5 and |E| ≥ 0.5, the energy of the prey is large enough to escape capture, so the Harris hawks need to dive around the prey several times. This behavior can be expressed by the following formula:

$$X(t+1) = \begin{cases} Y = P_b(t) - E \left| J P_b(t) - X(t) \right|, & \text{if } F(Y) < F(X(t)) \\ Z = Y + r_6 \times LF(D), & \text{if } F(Z) < F(X(t)) \end{cases} \tag{9}$$

where r_6 is a random vector, and LF(D) is the Levy flight, calculated as follows:

$$LF(x) = 0.01 \times \frac{\mu \times \sigma}{\left| \nu \right|^{1/\beta}}, \quad \sigma = \left( \frac{\Gamma(1+\beta) \times \sin(\pi \beta / 2)}{\Gamma\left(\frac{1+\beta}{2}\right) \times \beta \times 2^{\frac{\beta-1}{2}}} \right)^{1/\beta} \tag{10}$$

where μ and ν are random values in the interval (0, 1), and β is a constant set to 1.5 by default.
iii) Hard besiege. When r ≥ 0.5 and |E| < 0.5, individuals in the HHO swarm perform a hard besiege, considering the low escaping energy of the rabbit; they are eager to catch the prey, and the position update is related mainly to the global best position:

$$X(t+1) = P_b(t) - E \left| \Delta X(t) \right| \tag{11}$$
iv) Hard besiege with progressive rapid dives. When r < 0.5 and |E| < 0.5, the escaping energy of the prey is too low to escape, so the Harris hawks conduct a hard besiege with rapid dives and finally grab the prey. This behavior can be expressed by the following formula:

$$X(t+1) = \begin{cases} Y = P_b(t) - E \left| J P_b(t) - X_m(t) \right|, & \text{if } F(Y) < F(X(t)) \\ Z = Y + r_6 \times LF(D), & \text{if } F(Z) < F(X(t)) \end{cases} \tag{12}$$
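For concreteness, the following Python/NumPy sketch illustrates the escaping energy of Eq (6), the four-branch strategy selection, and the Levy flight of Eq (10). It is an illustrative rendering under the definitions above, not the authors' released Matlab code; the function and variable names are our own:

```python
import math
import numpy as np

def escaping_energy(t, max_iter):
    """Eq (6): E = 2 * E0 * (1 - t / max_iter), with E0 drawn from (-1, 1)."""
    E0 = np.random.uniform(-1, 1)
    return 2 * E0 * (1 - t / max_iter)

def choose_strategy(E, r):
    """Pick an HHO behavior from |E| and a random number r in (0, 1)."""
    if abs(E) >= 1:
        return "exploration"                                                  # Eq (4)
    if abs(E) >= 0.5:
        return "soft besiege" if r >= 0.5 else "soft besiege, rapid dives"    # Eqs (7), (9)
    return "hard besiege" if r >= 0.5 else "hard besiege, rapid dives"        # Eqs (11), (12)

def levy_flight(dim, beta=1.5):
    """Eq (10): Levy flight step LF(D) with the default beta = 1.5."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = np.random.uniform(0, 1, dim)   # mu and nu are drawn from (0, 1)
    nu = np.random.uniform(0, 1, dim)
    return 0.01 * mu * sigma / np.abs(nu) ** (1 / beta)

E = escaping_energy(t=50, max_iter=500)
print(choose_strategy(E, r=np.random.rand()), levy_flight(5))
```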
c) Literature review of the HHO algorithm.
It is clear that individuals in the HHO swarm have two exploration strategies with which to search for the global optimum, and, based on the escaping energy and a random control parameter, four exploitation behaviors with smart actions. This scheme is not complicated, is easy to apply, and converges quickly. Therefore, the HHO algorithm has been applied to various kinds of jobs. However, experiments soon confirmed that individuals in the HHO swarm are easily trapped in local optima and the overall results were not promising, so many improvements have been proposed to increase its capability. Zhao et al. introduced a piecewise linear map into the HHO algorithm, which effectively improved its phase conversion mechanism [60]. Chen et al. [61] proposed a variant of HHO that introduces chaos, a differential evolution strategy, and a multi-group strategy; the three strategies improve different parts of the algorithm, and experimental results showed that the variant performs quite well. Al-Betar et al. [62] proposed several improved versions of the HHO algorithm whose strategies include proportional and linear rank-based selection. Akdag et al. [63] introduced seven random distribution functions and applied the resulting HHO variant to the optimal power flow problem. Fan et al. [64] introduced a quasi-reflective learning mechanism (QRBL), which effectively improves the population diversity and convergence speed of the HHO algorithm. Gupta et al. [65] introduced several strategies, including opposition-based learning, nonlinear energy parameters, and a modified capture strategy; these improve the exploration efficiency of the HHO algorithm and help it avoid local optima. The HHO algorithm has furthermore been applied to other global optimization problems such as the travelling salesman problem [66, 67] and multi-objective feature selection [68].
3 The CSCAHHO algorithm
Individuals in the SCA swarm update their positions according to their current positions and the randomly weighted, sine- or cosine-functionalized distance between themselves and the global best candidate; therefore, they approach the global best at a low rate. To increase the convergence speed, the global best candidate should play a more important role, as in the following formulations: (13) (14)

where LF(D) represents the Levy flight operation, as in Eq (10):

$$LF(x) = 0.01 \times \frac{\mu \times \sigma}{\left| \nu \right|^{1/\beta}}, \quad \sigma = \left( \frac{\Gamma(1+\beta) \times \sin(\pi \beta / 2)}{\Gamma\left(\frac{1+\beta}{2}\right) \times \beta \times 2^{\frac{\beta-1}{2}}} \right)^{1/\beta} \tag{15}$$
On the other hand, if positions were updated only with Eqs (13) and (14), the global best candidate would play so large a role in the updates that individuals would easily be trapped in local optima. Therefore, more effort is needed to make the individuals more diverse, and the multiple updating ways of the HHO algorithm are suitable for this. We hereby further introduce the multiple updating ways of the HHO swarm into the updating equations.
Furthermore, in order to improve the randomness of the conversion mechanism, a chaotic map called the Hybrid map is introduced. Due to the characteristics of chaotic maps, the randomness of the conversion mechanism is well improved. The Hybrid map can be expressed by the following formula: (16) where b = 0.85, u_1 = 1.8, and u_2 = 2.0. The chaotic value fluctuates in the interval [-1, 1]; if the absolute value is taken, the absolute Hybrid chaos fluctuates in [0, 1], the same range as the pseudo-random numbers generated in computer programs, as shown in Fig 1.
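Eq (16) itself is not reproduced in this extraction. Purely to illustrate how a deterministic chaotic sequence can stand in for pseudo-random draws, the sketch below uses the standard logistic map, a different, well-known chaotic map, rather than the Hybrid map of Eq (16):

```python
def chaotic_sequence(n, x0=0.7, r=4.0):
    """Generate n chaotic values in [0, 1] with the logistic map
    x_{k+1} = r * x_k * (1 - x_k); with r = 4 the orbit fills [0, 1].

    Note: this is the logistic map, used only as an illustration;
    the paper uses the Hybrid map of Eq (16) instead.
    """
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        seq.append(x)
    return seq

print(chaotic_sequence(5))  # deterministic yet irregular, like random draws
```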
[Figure omitted. See PDF.]
The formula of the improved conversion mechanism can be expressed as follows.
(17)
The flowchart of the CSCAHHO algorithm is shown in Fig 2. The pseudo-code diagram of the CSCAHHO algorithm is given in Table 2.
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
As shown in Fig 2, the proposed algorithm guarantees that individuals in the swarm have a much larger exploitation ratio in the final stage and embrace more ways to update their positions. Better performance can be expected with only a small increase in complexity: the complexity remains O(T·N·D + N), where T, N, and D represent the maximum number of iterations, the number of individuals in the swarm, and the dimensionality, respectively, as shown in Table 2.
4 Simulation experiments
In this section, we carry out simulation experiments to verify the capability of the hybrid algorithm. First of all, benchmark functions are introduced: 27 benchmark functions are involved, including 9 unimodal functions (shown in Table 3), 9 two-dimensional multimodal functions (shown in Table 4), and 9 multi-dimensional multimodal functions (shown in Table 5). All of the simulation experiments were carried out on an HP DL380 Gen 10 server with 32 GB RAM and two Intel Xeon Bronze 3106 processors, running Matlab 2017b.
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
4.1 Experiments setup
In order to prove the capability of the improved hybrid algorithm, both the original SCA and HHO algorithms are involved in the simulation experiments. Furthermore, other well-known optimization algorithms, namely the AO, STOA, SOA, ChOA, and AOA, are also involved. To maintain the same conditions, the population size (the number of individuals in the swarm) is set to 30 and the dimensionality to 30 for all swarms. Final results are averaged over 1000 Monte Carlo runs. All other parameters are set according to the original versions of the respective algorithms, as shown in Table 6.
[Figure omitted. See PDF.]
4.2 Qualitative experiments
First of all, we carried out a qualitative analysis of the CSCAHHO algorithm; the search history, the trajectory of the first dimension, the average fitness values, and the convergence curve are shown in Fig 3.
[Figure omitted. See PDF.]
Fig 3 shows that the hybrid CSCAHHO algorithm performs well on both unimodal and multimodal benchmark functions. Individuals quickly approach the global best position and find the global optimum. The trajectories fluctuate at first but converge very fast. Note that some of the average fitness values increase during iterations, which means that individuals in the swarm do not always approach the global best and consequently maintain good diversification. The overall fast convergence of the best fitness value also supports this conclusion.
4.3 Intensification capability experiments
Unimodal benchmark functions have only one global optimum over their whole domain. The quicker the individuals approach the global optimum, the faster the algorithm converges. Experiments on unimodal benchmark functions therefore show the intensification capability; the results are listed in Table 7.
[Figure omitted. See PDF.]
The results in Table 7 show that in most cases the hybrid CSCAHHO algorithm performs best, although other algorithms such as the SCA, HHO, and AO sometimes find the global optimum within the same number of iterations.
4.4 Diversification capability experiments
Multimodal benchmark functions have many local optima together with one global optimum. Individuals are easily trapped in local optima while approaching it. To avoid being trapped, individuals need diversification capability. The simulation results on multimodal benchmark functions are shown in Tables 8 and 9.
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
The diversification results shown in Tables 8 and 9 also verify that the proposed CSCAHHO algorithm performs better in most cases, although some of the other algorithms occasionally take the top rank.
4.5 Acceleration convergence experiments
For a better and clearer understanding of the capabilities, a convergence analysis was carried out on all of the involved benchmark functions. The results are shown in Fig 4.
[Figure omitted. See PDF.]
All of the results demonstrate the better performance of the improved CSCAHHO algorithm: the residual errors are the smallest, the convergence curves are steadier, and the convergence rate is faster.
4.6 Scalability experiments
Real-world problems are described with more and more dimensions; therefore, the capability to solve high-dimensional problems is of great importance. Scalability experiments were carried out in this section, and the final results are shown in Tables 10-12.
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
[Figure omitted. See PDF.]
From Tables 10-12, we can see that the proposed CSCAHHO algorithm performs better than the others most of the time. However, some of the other algorithms, such as the STOA, AOA, and HHO, achieve the same results for some easy problems.
4.7 Friedman statistical test
The Friedman statistical test is usually introduced to verify whether the results of the proposed algorithm are better than, or on the same level as, those obtained by other algorithms. In this section, the simulation experiments were repeated 30 times and the Friedman statistical test was applied; the results are shown in Table 13.
[Figure omitted. See PDF.]
The results in Table 13 show that the proposed CSCAHHO algorithm performs significantly better than the other algorithms almost all the time. We can further conclude that the proposed CSCAHHO algorithm performs considerably better in optimizing traditional benchmark functions, whether they are unimodal or multimodal, scalable or of fixed dimensionality.
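As a hedged illustration of how such a test can be computed (this is not the authors' script, and the sample data below are invented), Python's scipy.stats.friedmanchisquare compares the repeated-run results of several algorithms:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical example: best fitness of 3 algorithms over 30 repeated
# runs of one benchmark function (one array per algorithm).
rng = np.random.default_rng(0)
cscahho = rng.normal(0.01, 0.005, 30)
sca = rng.normal(0.05, 0.010, 30)
hho = rng.normal(0.03, 0.008, 30)

stat, p = friedmanchisquare(cscahho, sca, hho)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.3g}")
# A small p (e.g. < 0.05) rejects the hypothesis that all
# algorithms perform on the same level on this function.
```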
4.8 Experiments on the CEC2014 benchmark function
To further verify the performance of the proposed CSCAHHO algorithm, we carried out simulation experiments on the CEC 2014 competition problems. The results are shown in Table 14.
[Figure omitted. See PDF.]
There are in total 150 best values in the results of the simulation experiments on the CEC 2014 test functions, as shown in Table 14; 99 of them are achieved by the proposed CSCAHHO algorithm, accounting for 66%.
5 Solving engineering design problems
An apparent conclusion can be drawn that the proposed CSCAHHO algorithm performs well on benchmark functions, whether unimodal or multimodal, scalable or non-scalable, classical or CEC competition functions. The overall results are so promising that we could not resist the temptation of applying the algorithm to real engineering problems. Therefore, we further carried out simulation experiments on classical engineering problems, which are widely used to test the capability of optimization algorithms.
In this section, the results are also compared with those obtained by the SCA, HHO, AOA, SOA, STOA, AO, and ChOA algorithms. Each algorithm was run 20 times, and the Wilcoxon rank sum test was calculated.
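A hedged sketch of the Wilcoxon rank sum test over the 20 runs, using scipy.stats.ranksums (the cost samples below are invented for illustration):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
cscahho_costs = rng.normal(1.72, 0.02, 20)  # hypothetical best costs over 20 runs
sca_costs = rng.normal(1.80, 0.05, 20)

stat, p = ranksums(cscahho_costs, sca_costs)
print(f"rank-sum statistic = {stat:.3f}, p = {p:.3g}")
# p < 0.05 indicates the two result samples differ significantly.
```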
5.1 Gear design problem
The gear design problem is a well-known engineering design problem [69]. The goal is to minimize the gear ratio cost by optimizing the numbers of teeth of the four gears. This problem is unconstrained.
Consider:
Objective:
Variable ranges:
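The objective above is not reproduced in this extraction. As a hedged reference, the standard gear train formulation from the literature (which we assume the paper follows) minimizes the squared deviation of the gear ratio from 1/6.931, with four integer tooth counts in [12, 60]:

```python
import numpy as np

def gear_train_cost(x):
    """Standard gear train design objective: squared deviation of the
    gear ratio x[2]*x[1] / (x[0]*x[3]) from the target 1/6.931."""
    x = np.round(x)                        # tooth counts are integers
    ratio = (x[2] * x[1]) / (x[0] * x[3])
    return (1.0 / 6.931 - ratio) ** 2

# Variable ranges: 12 <= x_i <= 60 for every tooth count.
print(gear_train_cost([49, 19, 16, 43]))   # a well-known near-optimal design
```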
The experimental results shown in Table 15 indicate that the CSCAHHO algorithm obtains the best value for the gear design. From the p-values, it can be seen that the results obtained by the CSCAHHO algorithm are significantly different from those of the other algorithms.
[Figure omitted. See PDF.]
5.2 Welded beam design problem
The purpose of the welded beam design problem is to reduce the manufacturing cost of the design; in essence, it is a cost minimization problem. It involves four variables: the weld thickness (h), the length of the attached part of the bar (l), the height of the bar (t), and the thickness of the bar (b). The problem contains 7 constraints.
Consider:
Objective:
Subject to:
Variable ranges:
where:
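The formulas are not reproduced in this extraction. As a hedged reference, the widely used formulation of the welded beam problem from the literature (we assume the paper follows the same version) can be sketched as follows, with x = [h, l, t, b]:

```python
import numpy as np

# Standard welded beam constants from the literature
P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam(x):
    """Return (cost, constraint values g_i <= 0) for x = [h, l, t, b]."""
    h, l, t, b = x
    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

    tau_p = P / (np.sqrt(2) * h * l)                     # primary shear stress
    M = P * (L + l / 2)
    R = np.sqrt(l**2 / 4 + ((h + t) / 2) ** 2)
    # some papers write l**2 / 4 here instead of l**2 / 12
    J = 2 * (np.sqrt(2) * h * l * (l**2 / 12 + ((h + t) / 2) ** 2))
    tau_pp = M * R / J                                   # secondary shear stress
    tau = np.sqrt(tau_p**2 + 2 * tau_p * tau_pp * l / (2 * R) + tau_pp**2)
    sigma = 6 * P * L / (b * t**2)                       # bending stress
    delta = 4 * P * L**3 / (E * t**3 * b)                # end deflection
    p_c = (4.013 * E * np.sqrt(t**2 * b**6 / 36) / L**2
           * (1 - t / (2 * L) * np.sqrt(E / (4 * G))))   # buckling load

    g = [tau - TAU_MAX, sigma - SIGMA_MAX, h - b,
         0.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0,
         0.125 - h, delta - DELTA_MAX, P - p_c]
    return cost, g
```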
The experimental results shown in Table 16 indicate that the CSCAHHO algorithm obtains the best results, and that they differ significantly from those of the comparison algorithms.
[Figure omitted. See PDF.]
5.3 Compression spring design problem
The compression spring design problem is a well-known problem in mechanical engineering. The goal is to minimize the weight of a tension/compression spring. The problem contains three variables: the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). There are also 4 constraints.
Consider:
Objective:
Subject to:
Variable ranges:
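Again, the formulas are omitted in this extraction; the standard tension/compression spring formulation from the literature (assumed here) is:

```python
def spring_design(x):
    """Return (weight, constraint values g_i <= 0) for x = [d, D, N]:
    wire diameter, mean coil diameter, number of active coils."""
    d, D, N = x
    weight = (N + 2) * D * d**2
    g = [
        1 - D**3 * N / (71785 * d**4),                  # minimum deflection
        (4 * D**2 - d * D) / (12566 * (D * d**3 - d**4))
            + 1 / (5108 * d**2) - 1,                    # shear stress
        1 - 140.45 * d / (D**2 * N),                    # surge frequency
        (d + D) / 1.5 - 1,                              # outer diameter limit
    ]
    return weight, g

# Typical variable ranges in the literature:
# 0.05 <= d <= 2.0, 0.25 <= D <= 1.3, 2 <= N <= 15
print(spring_design([0.0517, 0.3567, 11.289]))  # a well-known near-optimal design
```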
The experimental results are shown in Table 17. The CSCAHHO algorithm obtains the best value, and its results are significantly different from those of the comparison algorithms.
[Figure omitted. See PDF.]
6 Discussions and conclusions
In this paper, a chaotic hybrid of the Sine Cosine algorithm (SCA) and the Harris Hawk optimization (HHO) algorithm is proposed. The algorithm extracts the exploration capability of the improved SCA and the exploitation capability of the HHO algorithm, and mixes the two. The improved SCA adds Levy flight operators and a pull toward the global optimum, which improves its global search capability. Inspired by the phase transition in the HHO algorithm, its control parameter E is introduced, and a chaotic map is introduced to increase the randomness of this control parameter. To evaluate the performance of the CSCAHHO algorithm accurately, 27 standard functions, the CEC 2014 benchmark functions, and three engineering design problems were tested. The experimental results, compared with those of other meta-heuristic algorithms, prove that the CSCAHHO algorithm has better global exploration capability, faster convergence speed, and higher convergence accuracy.
Although the proposed CSCAHHO algorithm is only a hybrid, the detailed simulation experiments carried out in this paper verify its better performance. We can see that multiple updating ways for individuals, chaotic improvements, and the energy parameter balancing the exploration and exploitation procedures during iterations all play an important role in improving the capabilities of existing algorithms. Hybrid algorithms can be more efficient in optimization. How to build a simple hybrid of existing algorithms with fast convergence, low residual errors, and steady, stable behavior might be a promising line of future work.
About the Authors:
Yu-Jun Zhang
Roles: Software, Writing – original draft
Affiliation: School of Electronics and Information Engineering, Jingchu University of Technology, Jingmen, China
https://orcid.org/0000-0003-3016-8843
Yu-Xin Yan
Roles: Data curation
Affiliation: Academy of Arts, Jingchu University of Technology, Jingmen, China
Juan Zhao
Roles: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology
E-mail: [email protected]
Affiliation: School of Electronics and Information Engineering, Jingchu University of Technology, Jingmen, China
https://orcid.org/0000-0001-7047-2001
Zheng-Ming Gao
Roles: Conceptualization, Data curation, Formal analysis, Funding acquisition, Writing – review & editing
Affiliations: School of Computer Engineering, Jingchu University of Technology, Jingmen, China; Institute of Intelligent Information Technology, Hubei Jingmen Industrial Technology Research Institute, Jingmen, China
https://orcid.org/0000-0003-1566-7134
1. Yang X-S. Nature-inspired optimization algorithms: Challenges and open problems. Journal of Computational Science. 2020;46:101104. https://doi.org/10.1016/j.jocs.2020.101104.
2. Wolpert DH, Macready WG. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation. 1997;1(1):67–82.
3. Mirjalili S, Lewis A. The Whale Optimization Algorithm. Advances in Engineering Software. 2016;95:51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008.
4. Cheraghalipour A, Hajiaghaei-Keshteli M, Paydar MM. Tree growth algorithm (tga): A novel approach for solving optimization problems. Engineering Applications of Artificial Intelligence. 2018;72:393.
5. Simon D. Biogeography-based optimization. IEEE Transactions on Evolutionary Computation. 2008;12(6):702.
6. Abualigah L, Diabat A, Mirjalili S, Elaziz MA, Gandomi AH. The Arithmetic Optimization Algorithm. Computer Methods in Applied Mechanics and Engineering. 2021;376:113609.
7. Frenzel JF. Genetic algorithms. IEEE Potentials. 1993;12(3):21–4.
8. Sarker RA, Elsayed SM, Ray T. Differential evolution with dynamic parameters selection for optimization problems. IEEE Transactions on Evolutionary Computation. 2014;18(5):689.
9. Koza JR, Rice JP. Automatic programming of robots using genetic programming. Proceedings of the Tenth National Conference on Artificial Intelligence. 1992.
10. Beyer H-G, Schwefel H-P. Evolution strategies - A comprehensive introduction. Natural Computing. 2002;1(1):3.
11. Zaman HRR, Gharehchopogh FS. An improved particle swarm optimization with backtracking search optimization algorithm for solving continuous optimization problems. Engineering with Computers. 2021.
12. Rao R.V., Savsani VJ, Vakharia D.P. Teaching–learning-based optimization: an optimization method for continuous non-linear large scale problems. Information sciences. 2012;183(1):1.
13. Geem ZW, Kim JH, Loganathan GV. A new heuristic optimization algorithm: harmony search. Simulation. 2001;76(2):60.
14. Atashpaz-Gargari E, Lucas C. Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition. 2007 IEEE Congress on Evolutionary Computation. 2007:4661.
15. Tan Y, Zhu Y. Fireworks algorithm for optimization. Advances in Swarm Intelligence. 2010:355.
16. Zhang Q, Wang R, Yang J, Ding K, Li Y, Hu J. Collective decision optimization algorithm: A new heuristic optimization method. Neurocomputing. 2017;221:123.
17. Kumar M, J.Kulkarni A, Satapathy SC. Socio evolution & learning optimization algorithm: A socio-inspired optimization methodology. Future Generation Computer Systems. 2018;81:252.
18. Abdulhameed S, Rashid TA. Child Drawing Development Optimization Algorithm based on Child's Cognitive Development. Arabian Journal for Science and Engineering. 2021.
19. Qamar A, Irfan Y, Mehreen S. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowledge-based systems. 2020;195:105709.
20. Hashim FA, Houssein EH, Mabrouk MS, Al-Atabany W, Mirjalili S. Henry gas solubility optimization: A novel physics-based algorithm. Future Generation Computer Systems. 2019;101:646.
21. Erol OK, Eksin I. A new optimization method: big bang–big crunch. Advances in Engineering Software. 2006;37(2):106.
22. Mirjalili S, Mirjalili SM, Hatamlou A. Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput & Applic. 2016;27(2):495.
23. Abedinpourshotorban H, Shamsuddin SM, Beheshti Z, Jawawi D N.A. Electromagnetic field optimization: A physics-inspired metaheuristic optimization algorithm. Swarm and Evolutionary Computation. 2016;26:8.
24. Rashedi E, Nezamabadi-pour H, Saryazdi S. GSA: A Gravitational Search Algorithm. Information Sciences. 2009;179:2232.
25. Kaveh A., Dadras A. A novel meta-heuristic optimization algorithm: thermal exchange optimization. Advances in Engineering Software. 2017;110:69.
26. Formato RA. Central force optimization. Progress in Electromagnetics Research. 2007;77:425.
27. Shayanfar H, Gharehchopogh FS. Farmland fertility: A new metaheuristic algorithm for solving continuous optimization problems. Applied Soft Computing. 2018;71:728–46. https://doi.org/10.1016/j.asoc.2018.07.033.
28. Gharehchopogh FS, Maleki I, Dizaji ZA. Chaotic vortex search algorithm: metaheuristic algorithm for feature selection. Evolutionary Intelligence. 2021.
29. Karaboga D, Basturk B. A powerful and efficient algorithm for numerical function optimization: artificial bee colony(abc) algorithm. Journal of Global Optimization. 2007;39(3):459.
30. Gandomi AH, Yang X-S, Alavi AH. Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Engineering with computers. 2013;29(1):17.
31. Gharehchopogh FS, Shayanfar H, Gholizadeh H. A comprehensive survey on symbiotic organisms search algorithms. Artificial Intelligence Review. 2020;53(3):2265–312.
32. Ghafori S, Gharehchopogh FS. Advances in Spotted Hyena Optimizer: A Comprehensive Survey. Archives of Computational Methods in Engineering. 2021.
33. Yazdani M, Jolai F. Lion optimization algorithm (loa): a nature-inspired metaheuristic algorithm. Journal of computational design and engineering. 2016;3(1):24.
34. Eberhart R, Kennedy J. A new optimizer using particle swarm theory. MHS’95 Proceedings of the Sixth International Symposium on Micro Machine and Human Science. 1995:39.
35. Yang X-S. Firefly algorithms for multimodal optimization. International symposium on stochastic algorithms. 2009:169.
36. Mirjalili S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-based systems. 2015;89:228.
37. Dorigo M, Maniezzo V, Colorni A. Ant system: optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics). 1996;26(1):29. pmid:18263004
38. Faramarzi A, Heidarinejad M, Mirjalili S, Gandomi AH. Marine predators algorithm: A nature-inspired metaheuristic. Expert Systems with Applications. 2020;152:113377.
39. Li S, Chen H, Wang M, Heidari AA, Mirjalili S. Slime mould algorithm: A new method for stochastic optimization. Future Generation Computer Systems. 2020;111:300.
40. Saremi S, Mirjalili S, Lewis A. Grasshopper optimisation algorithm: theory and application. Advances in Engineering Software. 2017;105:30.
41. Mohammadi-Balani A, Nayeri MD, Azar A, Taghizadeh-Yazdi M. Golden eagle optimizer: A nature-inspired meta-heuristic algorithm. Computers & Industrial Engineering. 2021;152:107050.
42. Khishe M, Mosavi MR. Chimp Optimization Algorithm. Expert Systems with Applications. 2020;149(1):113338.
43. Dhiman G, Kaur A. STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Engineering Applications of Artificial Intelligence. 2019;82:148.
44. Dhiman G, Kumar V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowledge-Based Systems. 2019;165:169.
45. Abualigah L, Yousri D, Elaziz MA, Ewees AA, Al-qaness MAA, Gandomi AH. Aquila Optimizer: a novel meta-heuristic optimization algorithm. Computers & Industrial Engineering. 2021;157:107250.
46. Jia H, Peng X, Lang C. Remora optimization algorithm. Expert Systems with Applications. 2021;185:115665.
47. Abdollahzadeh B, Gharehchopogh FS, Mirjalili S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Computers & Industrial Engineering. 2021;158:107408. https://doi.org/10.1016/j.cie.2021.107408.
48. Abdollahzadeh B, Soleimanian Gharehchopogh F, Mirjalili S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. International Journal of Intelligent Systems. 2021;36(10):5887–958. https://doi.org/10.1002/int.22535.
49. Gharehchopogh FS, Gholizadeh H. A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm and Evolutionary Computation. 2019;48:1–24. https://doi.org/10.1016/j.swevo.2019.03.004.
50. Gao Z-M, Zhao J, Hu Y-R, Chen H. The Challenge for the Nature-Inspired Global Optimization Algorithms: Non-Symmetric Benchmark Functions. IEEE Access. 2021;9:106317-39.
51. Mirjalili S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowledge-Based Systems. 2016;96:120.
52. Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H. Harris Hawks optimization: algorithm and applications. Future Generation Computer Systems. 2019;97:849.
53. Li S, Fang H, Liu X. Parameter optimization of support vector regression based on sine cosine algorithm. Expert Systems with Applications. 2018;91:63.
54. Reddy KS, Panwar LK, Panigrahi BK, Kumar R. A new binary variant of sine-cosine algorithm: development and application to solve profit-based unit commitment problem. Arabian Journal for Science and Engineering. 2017;43:4041.
55. Nayak DR, Dash R, Majhi B, Wang S. Combining extreme learning machine with modified sine cosine algorithm for detection of pathological brain. Computers & Electrical Engineering. 2018;68:366.
56. Chen H, Heidari AA, Zhao X, Zhang L, Chen H. Advanced orthogonal learning-driven multi-swarm sine cosine optimization: framework and case studies. Expert Systems with Applications. 2019;45:50.
57. Chen H, Wang M, Zhao X. A multi-strategy enhanced sine cosine algorithm for global optimization and constrained practical engineering problems. Applied Mathematics and Computation. 2020;369:124872.
58. Zhu W, Ma C, Zhao X, Wang M, Heidari AA, Chen H, et al. Evaluation of Sino-foreign cooperative education project using orthogonal sine cosine optimized kernel extreme learning machine. IEEE Access. 2020;8:61107.
59. Gao Z-M, Zhao J, Li S-R, Hu Y-R. The improved equilibrium optimization algorithm with multiple updating discipline. Journal of Physics: Conference Series. 2020;1682:012054.
60. Zhao J, Gao Z-M, Zhang Y-J. Piecewise Linear map enabled Harris Hawk optimization algorithm. Journal of Physics: Conference Series. 2021;1994(1):012038.
61. Chen H, Heidari AA, Chen H, Wang M, Pan Z, Gandomi AH. Multi-population differential evolution-assisted Harris hawks optimization: Framework and case studies. Future Generation Computer Systems. 2020;111:175.
62. Al-Betar MA, Awadallah MA, Heidari AA, Chen H, Al-khraisat H, Li C. Survival exploration strategies for Harris hawks optimizer. Expert Systems with Applications. 2020;168:114243.
63. Akdag O, Ates A, Yeroglu C. Modification of Harris hawks optimization algorithm with random distribution functions for optimum power flow problem. Neural Computing and Applications. 2020;33(6):1959.
64. Fan Q, Chen Z, Xia Z. A novel quasi-reflected Harris hawks optimization algorithm for global optimization problems. Soft Computing. 2020;24(19):14825.
65. Gupta S, Deep K, Heidari AA, Moayedi H, Wang M. Opposition-based learning Harris hawks optimization with advanced transition rules: Principles and analysis. Expert Systems with Applications. 2020;158:113510.
66. Gharehchopogh FS, Abdollahzadeh B. An efficient harris hawk optimization algorithm for solving the travelling salesman problem. Cluster Computing. 2021;(4).
67. Abdollahzadeh B, Gharehchopogh FS, Barshandeh S. Discrete farmland fertility optimization algorithm with metropolis acceptance criterion for traveling salesman problems. International Journal of Intelligent Systems. 2021;36(3):1270-303. https://doi.org/10.1002/int.22342.
68. Abdollahzadeh B, Gharehchopogh FS. A multi-objective optimization algorithm for feature selection problems. Engineering with Computers. 2021;(2).
69. Gharehchopogh FS, Farnad B, Alizadeh A. A modified farmland fertility algorithm for solving constrained engineering problems. Concurrency and Computation: Practice and Experience. 2021;33(17):e6310. https://doi.org/10.1002/cpe.6310.