PLoS ONE
On the performance improvement of Butterfly Optimization approaches for global optimization and Feature Selection

Competing Interests: The authors have declared that no competing interests exist.

Abstract

The Butterfly Optimization Algorithm (BOA) is a recent metaheuristic algorithm that mimics the mating and foraging behavior of butterflies. In this paper, three improved versions of BOA are developed to prevent the original algorithm from getting trapped in local optima and to achieve a good balance between exploration and exploitation. In the first version, an Opposition-Based strategy is embedded in BOA, while in the second a Chaotic Local Search is embedded. In the third, both strategies, Opposition-Based Learning and Chaotic Local Search, are integrated to obtain the most optimal/near-optimal results. The proposed versions are compared against the original Butterfly Optimization Algorithm (BOA), Grey Wolf Optimizer (GWO), Moth-Flame Optimization (MFO), Particle Swarm Optimization (PSO), Sine Cosine Algorithm (SCA), and Whale Optimization Algorithm (WOA) using the CEC 2014 benchmark functions and 4 different real-world engineering problems, namely the welded beam design, tension/compression spring, pressure vessel design, and speed reducer design problems. Furthermore, the proposed approaches are applied to the feature selection problem using 5 UCI datasets. The results show the superiority of the third version (CLSOBBOA) in achieving the best results in terms of speed and accuracy.

Assiri and Mirjalili: On the performance improvement of Butterfly Optimization approaches for global optimization and Feature Selection

Introduction

In recent years, the complexity of real-world engineering optimization problems has increased rapidly due to the advent of the latest technologies, and many optimization methods have been introduced to find optimal solutions to these problems. These algorithms can be divided into two major categories: deterministic and stochastic. In the former category, for example linear and non-linear programming [1], the solution of the current iteration is used in the next iteration to obtain the updated solution. The methods in this category have some limitations, such as falling into local optima, relying on a single solution, and other issues regarding the search space, as mentioned in [2]. The latter category comprises stochastic methods, also known as metaheuristics, which generate and use random variables. This category has many advantages, such as flexibility, simplicity, being gradient-free, and independence from the problem. Metaheuristic algorithms have been proposed by studying creatures’ behavior, physical phenomena, or evolutionary concepts and have been successfully applied to many applications [3–5]. Genetic Algorithm (GA) [6], Differential Evolution (DE) [7], Particle Swarm Optimization (PSO) [8], Artificial Bee Colony (ABC) [9], Ant Colony Optimization (ACO) [10], and Simulated Annealing (SA) [11] are some of the most conventional metaheuristic algorithms.
Recently, numerous optimization algorithms have appeared, such as: Cuckoo Search (CS) [12], Gravitational Search Algorithm (GSA) [13], Crow Search Algorithm (CSA) [14], Dragonfly Algorithm (DA) [15], Biogeography-Based Optimization algorithm (BBO) [16], Bat algorithm (BA) [17], Whale Optimization Algorithm (WOA) [18], Grasshopper optimization algorithm (GOA) [19], Emperor penguin optimizer (EPO) [20], Squirrel search algorithm (SSA) [21], Seagull Optimization Algorithm (SOA) [22], Nuclear Reaction Optimization (NRO) [23], Salp swarm algorithm [24], Harris Hawks Optimization (HHO) [25], Slime Mould Algorithm (SMA) [26], Henry Gas Solubility Optimization (HGSO) [27], Elephant Herd Optimization (EHO) [28], Ant-Lion Optimization (ALO) [29] and Moth-Flame Optimization (MFO) [30].

The butterfly optimization algorithm [31] is a novel population-based metaheuristic algorithm that mimics the foraging behavior of butterflies. BOA has been applied in many fields. In [32], Aygül et al. used BOA to find the maximum power point tracking under partial shading conditions (PSC) in photovoltaic (PV) systems. Lal et al. in [33] applied BOA to Automatic Generation Control (AGC) of two nonlinear power systems. Also, in [34], Arora and Anand embedded learning automata in BOA. Li et al. in [35] proposed an improved version of BOA using the Cross-Entropy method to achieve a better balance between exploration and exploitation. Arora and Anand proposed a binary version of BOA and applied it to the Feature Selection (FS) problem [36]. Another binary version, also applied to feature selection, was introduced by Zhang et al. [37] using a new initialization strategy and a new operator added to the transfer function. Likewise, Fan et al. [38] tried to improve BOA performance by adding a fragrance coefficient and enhancing the local and global search.

A guiding weight and a population-restart mechanism were introduced by Guo et al. [39]. BOA has also been hybridized with other metaheuristic algorithms such as FPA [40] and ABC [41]. Also, Sharma and Saha in [42] proposed an updated version of BOA using a mutualism scheme. Many real-world problems have been solved using the original BOA thanks to its advantages, such as ease of implementation, simplicity, and a small number of parameters. However, in some cases, like other metaheuristic algorithms, it may become stuck in local-optima regions, which leads to premature convergence.

Despite the success of the above-mentioned algorithms in enhancing BOA's search capabilities, some limitations and drawbacks remain: 1) BOA still has difficulty escaping from local-optimum regions, especially when applied to complex or high-dimensional problems; 2) each enhanced BOA variant addresses only one issue (initialization, diversity, or the balance between exploration and exploitation). This motivates us to introduce further enhancements.

The Opposition-Based Learning (OBL) strategy has been integrated with many metaheuristic algorithms, such as PSO [43], GSA [44], ACO [45], GWO [46] and DE [47], to strengthen their exploration abilities. Also, the Chaotic Local Search (CLS) strategy is used to achieve a good balance between exploration and exploitation. The CLS concept has been introduced into numerous metaheuristics, such as PSO [43], Tabu Search [48] and ABC [49].

In this paper, three enhanced versions of BOA are introduced. In the first proposed version, the Opposition-Based Learning strategy is used to enhance population diversity by checking the opposites of random solutions in the initialization phase and the updating step. In the second proposed version, Chaotic Local Search (CLS) is incorporated into BOA to exploit the regions near the best solutions. In the last version, both OBL and CLS are used together to enhance the overall performance. To the best of our knowledge, this is the first time the CLS and OBL concepts have been used in the BOA algorithm.

This paper is organized as follows: Section 2 provides the basics of BOA. The three novel variants and the concepts of OBL and CLS are introduced in Section 3. The experimental results and discussion, and the conclusion and future work, are presented in Sections 4 and 5 respectively.

1 Preliminaries

In this section, the BOA inspiration and mathematical equations are presented first. Then, the basics of Opposition-Based Learning and Chaotic Local Search are described.

1.1 Butterfly optimization algorithm

The BOA equations and complexity are described in detail in the following subsections.

1.1.1 Inspiration & mathematical equations

Butterflies belong to the order Lepidoptera in the Linnaean classification of the Animal Kingdom [50]. To find food or a mating partner, they use their senses of sight, taste, and smell. The Butterfly Optimization Algorithm (BOA) is a recent nature-inspired algorithm developed by Arora and Singh in 2018 [31]. BOA simulates the behavior of butterflies in food foraging. Biologically, each butterfly has sense receptors that cover its whole body. These receptors are chemoreceptors and are used for smelling/sensing the fragrance of food or flowers. To model the butterflies' behavior, it is assumed that each butterfly produces fragrance with some power/intensity. If a butterfly is able to sense the fragrance of the best butterfly, it moves toward the position of the best butterfly. Otherwise, if it cannot sense any fragrance, it moves randomly in the search space. In BOA, the fragrance is defined as a function of the physical stimulus intensity, as given in Eq 1.

where pfi refers to the amount of fragrance perceived by other butterflies, c is the sensory modality, and I and a refer to the stimulus intensity and power exponent respectively. The global search (exploration) and local search (exploitation) phases are given by Eqs 2 and 3 respectively.
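The bodies of the three referenced equations were lost in extraction; as defined in the original BOA paper by Arora and Singh [31], they take the following form, where r is a uniform random number in [0, 1]:

```latex
% Eq 1: fragrance as a function of stimulus intensity
pf_i = c\,I^{a}
% Eq 2: global search phase (movement toward the best solution g^*)
x_i^{t+1} = x_i^{t} + \left(r^{2}\,g^{*} - x_i^{t}\right)\cdot pf_i
% Eq 3: local search phase (random walk using two butterflies j and k)
x_i^{t+1} = x_i^{t} + \left(r^{2}\,x_j^{t} - x_k^{t}\right)\cdot pf_i
```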

Algorithm 1 Butterfly Optimization Algorithm (BOA)

1: Initialize Dim, Max_Iter, curr_Iter, Objective Function

2: Generate uniformly distributed solutions (Initial Population)    X = (x1, x2, …, xn)

3: Define sensory modality c, stimulus intensity I, and switch probability p

4: calculate stimulus intensity Ii at xi using f(xi)

5: while (curr_Iter < Max_Iter) do

6:  for each butterfly in (X) do

7:   Calculate fragrance using Eq 1

8:  end for

9:  g* = best butterfly

10:  for each butterfly in (X) do

11:   r = rand()

12:   if r < p then

13:    Update butterfly position using Eq 2

14:   else

15:    Update butterfly position using Eq 3

16:   end if

17:  end for

18:  Update value of a

19: end while

20: Return g*.
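To make Algorithm 1 concrete, the following is a minimal Python sketch of the BOA loop. It is an illustrative reimplementation, not the authors' code: the power exponent a is kept fixed rather than updated each iteration, a non-negative objective is assumed so the fragrance I^a is well defined, and all parameter defaults are illustrative choices.

```python
import numpy as np

def boa(obj, dim, n=20, max_iter=100, p=0.8, c=0.01, a=0.1,
        lb=-100.0, ub=100.0, seed=0):
    """Minimal BOA sketch of Algorithm 1 (illustrative, not the authors' code)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))            # step 2: uniform initial population
    fit = np.array([obj(x) for x in X])          # step 4: stimulus intensity I_i
    best = fit.argmin()
    g, g_fit = X[best].copy(), fit[best]         # g*: best butterfly found so far
    for _ in range(max_iter):                    # step 5: main loop
        frag = c * fit ** a                      # Eq 1: fragrance pf = c * I^a
        for i in range(n):
            r = rng.random()
            if r < p:                            # Eq 2: global move toward g*
                X[i] = X[i] + (r**2 * g - X[i]) * frag[i]
            else:                                # Eq 3: local random walk
                j, k = rng.integers(0, n, 2)
                X[i] = X[i] + (r**2 * X[j] - X[k]) * frag[i]
            X[i] = np.clip(X[i], lb, ub)
            fit[i] = obj(X[i])
            if fit[i] < g_fit:                   # keep the best solution found
                g, g_fit = X[i].copy(), fit[i]
    return g, g_fit
```

Calling `boa(lambda x: float((x**2).sum()), dim=5)` minimizes a sphere function over [−100, 100]^5; the best-so-far bookkeeping guarantees the returned fitness never exceeds the best initial fitness.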

1.1.2 Complexity

To compute the BOA complexity, assume the population size is P, the maximum number of iterations is N, and the problem dimension is D. The BOA complexity is then O(N(D × P + D × C)) = O(NDP + NDC), where C is the cost of one fitness-function evaluation.

1.2 Opposition-based Learning

Tizhoosh in [51] introduced Opposition-Based Learning (OBL) to accelerate convergence by calculating the opposite of the current solution and keeping the better of the two. In [47], a mathematical proof is given showing that opposite solutions are more likely to be near the optimum than purely random ones. The opposite solution X̄i is calculated as X̄i = a + b − Xi, where a and b are the lower and upper bounds respectively.
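As a concrete illustration (not the authors' code), opposition-based initialization can be sketched as follows; the function name and the use of a sorting step to keep the best N of the 2N candidates are assumptions consistent with the description above.

```python
import numpy as np

def opposition_init(obj, n, dim, lb, ub, seed=0):
    """Generate n random solutions, add their opposites a + b - X,
    and keep the n best of the combined pool (sorted by fitness)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))
    X_opp = lb + ub - X                        # opposite solutions
    pool = np.vstack([X, X_opp])               # 2n candidates
    fit = np.array([obj(x) for x in pool])
    keep = np.argsort(fit)[:n]                 # best n, ascending fitness
    return pool[keep], fit[keep]
```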

1.3 Chaotic local search

The characteristics of chaotic systems can be used to build a local-search operator that strengthens the exploitation abilities in solving optimization tasks. Chaos refers to deterministic, nonlinear, complex behavior. Many chaotic maps exist in the literature, such as the logistic, singer, tent, piecewise, and sinusoidal maps; the efficiency of a chaotic map depends on the problem itself, as mentioned by Fister et al. [52, 53]. The logistic map is used in this paper, and its sequence is obtained from C(k+1) = μ Ck (1 − Ck), where μ = 4, 0 ≤ C1 ≤ 1, and C1 ≠ 0.25, 0.5, 0.75, 1. To calculate the candidate solution CS from the target position T, the next equation is used.
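A short sketch of the logistic-map sequence described above (the function name and starting value are illustrative; μ = 4 puts the map in its fully chaotic regime):

```python
def logistic_seq(c0=0.7, mu=4.0, steps=5):
    """Logistic-map chaotic sequence: C_{k+1} = mu * C_k * (1 - C_k)."""
    seq = [c0]
    for _ in range(steps):
        seq.append(mu * seq[-1] * (1.0 - seq[-1]))
    return seq
```

For example, starting from C1 = 0.7 the next value is 4 × 0.7 × 0.3 = 0.84, and the sequence stays within [0, 1].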

2 The proposed approaches

2.1 Opposition-Based BOA (OBBOA)

The first version, called OBBOA, improves the performance of BOA using the OBL strategy. OBL enhances BOA by improving its ability to explore the search space deeply and by speeding up convergence to the optimal value. OBL is applied in two stages: first, in the initialization stage, the opposite of every initial solution is computed and the best N solutions are selected; second, OBL is embedded in the updating stage. The pseudo-code of this version is given in Alg. 2.

Algorithm 2 Opposition-Based BOA (OBBOA)

1: Initialize Dim, Max_Iter, curr_Iter, Objective Function

2: Generate uniformly distributed solutions (Initial Population)     X = (x1, x2, …, xn)

3: Define sensory modality c, stimulus intensity I, and switch probability p

4: calculate stimulus intensity Ii at xi using f(xi)

5: Compute X¯

6: Select the best N solutions from X ∪ X̄

7: while (curr_Iter < Max_Iter) do

8:  for each butterfly in (X) do

9:   Calculate fragrance using Eq 1

10:  end for

11:  g* = best butterfly

12:  for each butterfly in (X) do

13:   r = rand()

14:   if r ≤ p then

15:    Update butterfly position using Eq 2

16:   else

17:    Update butterfly position using Eq 3

18:   end if

19:   Calculate x¯

20:   xi = xi¯ if f(xi¯) < f(xi)

21:  end for

22:  Update value of a

23: end while

24: Return g*.

2.2 Chaotic Local Search BOA (CLSBOA)

In the second version, called CLSBOA, Chaotic Local Search is integrated with BOA to achieve a proper balance between exploration and exploitation. The pseudo-code of this version is given in Alg. 3.
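The CLS step refines the best solution g* by greedily testing chaotically perturbed candidates. The sketch below is hypothetical in its details: the paper's exact candidate-generation equation is not reproduced here, and the perturbation radius of 1% of the search range is an assumption made for illustration.

```python
import numpy as np

def cls_refine(obj, g_best, f_best, lb, ub, c0=0.7, steps=10):
    """Chaotic local search around g*: perturb with a logistic-map sequence
    and accept a candidate only if it improves the fitness (greedy)."""
    c = c0
    g, fg = g_best.copy(), f_best
    radius = 0.01 * (ub - lb)                  # assumed perturbation radius
    for _ in range(steps):
        c = 4.0 * c * (1.0 - c)                # logistic map, mu = 4
        cand = np.clip(g + (2.0 * c - 1.0) * radius, lb, ub)
        fc = obj(cand)
        if fc < fg:                            # greedy acceptance (Alg. 3, line 19)
            g, fg = cand, fc
    return g, fg
```

Because acceptance is greedy, the refined fitness can never be worse than the input fitness.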

Algorithm 3 Chaotic Local Search BOA (CLSBOA)

1: Initialize Dim, Max_Iter, curr_Iter, Objective Function

2: Generate uniformly distributed solutions (Initial Population)     X = (x1, x2, …, xn)

3: Define sensory modality c, stimulus intensity I, and switch probability p

4: calculate stimulus intensity Ii at xi using f(xi)

5: while (curr_Iter < Max_Iter) do

6:  for each butterfly in (X) do

7:   Calculate fragrance using Eq 1

8:  end for

9:  g* = best butterfly

10:  for each butterfly in (X) do

11:   r = rand()

12:   if r < p then

13:    Update butterfly position using Eq 2

14:   else

15:    Update butterfly position using Eq 3

16:   end if

17:  end for

18:  Generate the candidate solution CS by performing the CLS strategy

19:  g* = CS if f(CS)<f(g*)

20:  Update value of a

21: end while

22: Return g*.

2.3 Chaotic Local Search Opposition-Based BOA (CLSOBBOA)

In this version, both of the two previous modifications are combined to enhance BOA and obtain the best near-optimal solutions.

Complexity:

Assuming the population size is P, the maximum number of iterations is N, and the problem dimension is D, the CLSOBBOA complexity is O(BOA) + O(OBL) + O(CLS) = O(N(D × P + D × C + P + P)) = O(NDP + NDC), where C is the cost of one fitness-function evaluation.

Algorithm 4 Chaotic Local Search & Opposition-Based BOA (CLSOBBOA)

1: Initialize Dim, Max_Iter, curr_Iter, Objective Function

2: Generate uniformly distributed solutions (Initial Population)     X = (x1, x2, …, xn)

3: Define sensory modality c, stimulus intensity I, and switch probability p

4: calculate stimulus intensity Ii at xi using f(xi)

5: Compute X¯

6: Select the best N solutions from X ∪ X̄

7: while (curr_Iter < Max_Iter) do

8:  for each butterfly in (X) do

9:   Calculate fragrance using Eq 1

10:  end for

11:  g* = best butterfly

12:  for each butterfly in (X) do

13:   r = rand()

14:   if r < p then

15:    Update butterfly position using Eq 2

16:   else

17:    Update butterfly position using Eq 3

18:   end if

19:   Calculate x¯

20:   xi = xi¯ if f(xi¯) < f(xi)

21:  end for

22:  Generate the candidate solution CS by performing the CLS strategy

23:  g* = CS if f(CS)<f(g*)

24:  Update value of a

25: end while

26: Return g*.

3 Experiments

In this section, the proposed algorithms are first tested on the CEC benchmark functions, as shown in the first subsection; afterwards, these algorithms are applied to 5 UCI datasets.

3.1 Benchmark functions

In this subsection, 30 functions are used to compare the algorithms using several statistical measures.

3.1.1 Test functions

A set of 30 functions from CEC 2014 is used to compare the performance of the proposed algorithms with other state-of-the-art algorithms. These benchmark functions have novel characteristics such as rotated trap problems, graded levels of linkage, and functions composed through dimension-wise properties. The benchmark can be categorized into unimodal, multimodal, hybrid, and composite functions; the definitions of these functions are shown in Table 1, where Opt. refers to the mathematical optimal value and the variables are bounded in the interval [−100, 100].

Table 1
CEC2014 functions.
No. | Type | Name | Opt.
F1 (CEC) | Unimodal functions | Rotated high conditioned elliptic function | 100
F2 (CEC) | Unimodal functions | Rotated bent cigar function | 200
F3 (CEC) | Unimodal functions | Rotated discus function | 300
F4 (CEC) | Simple multimodal functions | Shifted and rotated Rosenbrock's function | 400
F5 (CEC) | Simple multimodal functions | Shifted and rotated Ackley's function | 500
F6 (CEC) | Simple multimodal functions | Shifted and rotated Weierstrass function | 600
F7 (CEC) | Simple multimodal functions | Shifted and rotated Griewank's function | 700
F8 (CEC) | Simple multimodal functions | Shifted Rastrigin's function | 800
F9 (CEC) | Simple multimodal functions | Six Hump Camel Back | 900
F10 (CEC) | Simple multimodal functions | Shifted and rotated Rastrigin's function | 1000
F11 (CEC) | Simple multimodal functions | Shifted and rotated Schwefel's function | 1100
F12 (CEC) | Simple multimodal functions | Shifted and rotated Katsuura function | 1200
F13 (CEC) | Simple multimodal functions | Shifted and rotated HappyCat function | 1300
F14 (CEC) | Simple multimodal functions | Shifted and rotated HGBat function | 1400
F15 (CEC) | Simple multimodal functions | Shifted and rotated Expanded Griewank's plus Rosenbrock's function | 1500
F16 (CEC) | Simple multimodal functions | Shifted and rotated Expanded Scaffer's F6 function | 1600
F17 (CEC) | Hybrid functions | Hybrid function 1 (N = 3) | 1700
F18 (CEC) | Hybrid functions | Hybrid function 2 (N = 3) | 1800
F19 (CEC) | Hybrid functions | Hybrid function 3 (N = 4) | 1900
F20 (CEC) | Hybrid functions | Hybrid function 4 (N = 4) | 2000
F21 (CEC) | Hybrid functions | Hybrid function 5 (N = 5) | 2100
F22 (CEC) | Hybrid functions | Hybrid function 6 (N = 5) | 2200
F23 (CEC) | Composition functions | Composition function 1 (N = 5) | 2300
F24 (CEC) | Composition functions | Composition function 2 (N = 3) | 2400
F25 (CEC) | Composition functions | Composition function 3 (N = 3) | 2500
F26 (CEC) | Composition functions | Composition function 4 (N = 5) | 2600
F27 (CEC) | Composition functions | Composition function 5 (N = 5) | 2700
F28 (CEC) | Composition functions | Composition function 6 (N = 5) | 2800
F29 (CEC) | Composition functions | Composition function 7 (N = 3) | 2900
F30 (CEC) | Composition functions | Composition function 8 (N = 3) | 3000

3.1.2 Comparative algorithm

In order to test our algorithms, we compare the three proposed versions with several metaheuristic algorithms: the native Butterfly Optimization Algorithm (BOA), Grey Wolf Optimizer (GWO), Moth-Flame Optimization (MFO), Particle Swarm Optimization (PSO), Sine Cosine Algorithm (SCA), and Whale Optimization Algorithm (WOA) [54].

The number of search agents is set to 50 and the maximum number of iterations is fixed at 500. The parameter settings of all comparative algorithms are given in Table 2.

Table 2
Meta-heuristic algorithms parameters settings.
Alg. | Parameter | Value
BOA | a | 0.1
GWO | a | [0, 2]
MFO | t | [−1, 1]
MFO | b | 1
PSO | wMax | 0.9
PSO | wMin | 0.2
PSO | c1 | 2.0
PSO | c2 | 2.0
SCA | a | 2
WOA | a | 2
WOA | b | 2

3.1.3 Results & discussion

In this section, the proposed versions (OBBOA, CLSBOA, and CLSOBBOA) are compared with the original BOA, as shown in Table 3. From this table, it can be noticed that the third proposed version, CLSOBBOA, achieves the best results in terms of the Average/Mean, Best, Worst, and Standard Deviation (SD).

Table 3
The comparison results of all algorithms over 30 functions.
FAlgorithmBestWorstMeanSD
F1BOA3.5971e+073.1810e+081.0080e+061.2667e+5
OBBOA1.6723e+072.3640e+086.7445e+074.7897e+07
CLSBOA5.7586e+077.2621e+081.2429e+081.4759e+08
CLSOBBOA9.5454e+041.9320e+078.0108e+077.2858e+07
F2BOA2.6574e+091.0043e+104.4261e+092.5605e+09
OBBOA7.1006e+088.8216e+093.3621e+091.8186e+09
CLSBOA2.2787e+099.2016e+094.1975e+092.0389e+09
CLSOBBOA6.6739e+036.3838e+073.2066e+054.5402e+3
F3BOA1.2913e+041.8349e+041.4306e+042.5048e+03
OBBOA7.2557e+031.7454e+041.2592e+042.7249e+03
CLSBOA1.1739e+041.7093e+041.3854e+042.5865e+03
CLSOBBOA8.5819e+031.5012e+041.1610e+041.7285e+3
F4BOA2.0912e+033.8529e+032.6292e+035.6597e+02
OBBOA1.2404e+034.4836e+032.3235e+038.2272e+02
CLSBOA1.9510e+035.3563e+032.7072e+031.0699e+03
CLSOBBOA4.2516e+22.7130e+038.8079e+0213.310
F5BOA5.2042e+025.2066e+025.2049e+020.1050
OBBOA5.2036e+025.2061e+025.2047e+020.0786
CLSBOA5.2032e+025.2052e+025.2038e+020.0775
CLSOBBOA5.2028e+025.2064e+025.2040e+020.0565
F6BOA6.0708e+026.0956e+026.0832e+021.0965
OBBOA6.0725e+026.0911e+026.0840e+020.5863
CLSBOA6.0770e+026.1002e+026.0850e+020.9281
CLSOBBOA6.0190e+026.1009e+026.0843e+020.577
F7BOA8.0304e+029.5780e+028.7396e+0262.3545
OBBOA7.6548e+029.7498e+028.5850e+0256.6647
CLSBOA8.1979e+028.8123e+028.4222e+0236.6645
CLSOBBOA7.0012e+028.8830e+027.3922e+020.06032
F8BOA8.6394e+028.8581e+028.7199e+0210.1535
OBBOA8.5749e+028.9292e+028.7059e+029.0216
CLSBOA8.5810e+028.9173e+028.6610e+0211.4144
CLSOBBOA8.0436e+28.8665e+028.3193e+022.56771
F9BOA9.6165e+029.7920e+029.6592e+029.0121
OBBOA9.4129e+029.8231e+029.6468e+0210.3115
CLSBOA9.5419e+029.8318e+029.6244e+0211.6616
CLSOBBOA9.5529e+029.7704e+029.6255e+026.2637
F10BOA2.5486e+033.0370e+032.6438e+031.8702e+02
OBBOA2.2832e+033.0108e+032.5974e+031.9311e+02
CLSBOA2.5452e+033.0265e+032.6622e+032.1390e+02
CLSOBBOA1.1924e+33.0311e+031.6173e+031.4853e+02
F11BOA2.6618e+033.1056e+032.7892e+031.6215e+02
OBBOA2.2947e+033.1046e+032.7424e+032.1860e+02
CLSBOA2.6374e+033.2235e+032.7841e+032.2713e+02
CLSOBBOA1.7170e+032.8534e+032.7774e+031.6215e+2
F12BOA1.2017e+031.2023e+031.2019e+030.2708
OBBOA1.2011e+031.2022e+031.2017e+030.3247
CLSBOA1.2015e+031.2021e+031.2017e+030.2306
CLSOBBOA1.2009e+031.2019e+031.2015e+030.1381
F13BOA1.3039e+031.3054e+031.3045e+030.6891
OBBOA1.3032e+031.3052e+031.3041e+030.5280
CLSBOA1.3037e+031.3062e+031.3044e+031.0008
CLSOBBOA1.3001e+031.3053e+031.3022e+030.05490
F14BOA1.4282e+031.4504e+031.4354e+039.3979
OBBOA1.4245e+031.4490e+031.4379e+036.8308
CLSBOA1.4333e+031.4541e+031.4373e+038.0703
CLSOBBOA1.4002e+031.4465e+031.4320e+030.1292
F15BOA2.8689e+036.500e+035.0786e+03
OBBOA1.9296e+031.3932e+044.4538e+033.4833e+03
CLSBOA3.2042e+033.1369e+047.7426e+037.2418e+03
CLSOBBOA1.5024e+031.0747e+044.7610e+031.0485e+02
F16BOA1.6035e+031.6038e+031.6036e+030.1655
OBBOA1.6033e+031.6038e+031.6036e+030.1570
CLSBOA1.6033e+031.6037e+031.6035e+030.2070
CLSOBBOA1.6033e+31.6038e+031.6035e+030.0598
F17BOA2.3517e+055.2617e+053.3745e+051.1954e+05
OBBOA7.6198e+045.1763e+052.2619e+051.2488e+05
CLSBOA3.1117e+056.5610e+054.0887e+051.5533e+05
CLSOBBOA4.8701e+047.0552e+058.5776e+042.5502e+04
F18BOA1.7587e+044.8365e+063.0925e+051.0695e+06
OBBOA1.0590e+041.7808e+061.2400e+053.9117e+05
CLSBOA1.3717e+041.6055e+061.3132e+053.6880e+05
CLSOBBOA7.9930e+31.3861e+054.6620e+043.3053
F19BOA1.9279e+031.9786e+031.9389e+0318.0575
OBBOA1.9071e+031.9772e+031.9268e+0319.6003
CLSBOA1.9265e+032.0442e+031.9461e+0329.3827
CLSOBBOA1.9026e+31.9512e+031.9250e+032.9060
F20BOA7.6669e+038.6363e+042.0606e+041.98136e+04
OBBOA2.2118e+033.0375e+041.3241e+048.30101e+03
CLSBOA9.5177e+037.5429e+041.8474e+041.64929e+04
CLSOBBOA5.1116e+033.6863e+041.0850e+048.91533e+03
F21BOA4.5448e+041.6143e+063.2084e+054.02760e+05
OBBOA1.7469e+049.8848e+051.5497e+052.17338e+05
CLSBOA2.7288e+046.7756e+051.8627e+052.16049e+05
CLSOBBOA3.6120e+034.0706e+041.3284e+041.5581e+03
F22BOA2.4161e+032.5767e+032.4490e+0361.2592
OBBOA2.2804e+032.4836e+032.3877e+0357.6424
CLSBOA2.3370e+032.7499e+032.4365e+031.05537e+02
CLSOBBOA2.230e+032.4935e+032.3890e+0318.5703
F23BOA2.5000e+0325002500
OBBOA2.5000e+032.5000e+0325004.1730e-13
CLSBOA2.5000e+0325002500
CLSOBBOA2.5000e+032.5000e+0325004.1730e-13
F24BOA2.5795e+0326002.5918e+0312.1467
OBBOA2.5544e+0326002.5877e+0314.9424
CLSBOA2.5927e+0326002.5968e+035.6880
CLSOBBOA2.5592e+03326002.5907e+038.6636
F25BOA270027002.6982e+035.4556
OBBOA2.6822e+032.7000e+032.6978e+035.4111
CLSBOA270027002.6990e+032.9984
CLSOBBOA2.682e+032.7000e+032.6998e+035.45561
F26BOA2.7023e+032.7067e+032.7034e+031.7518
OBBOA2.7003e+032.7033e+032.7016e+030.9510
CLSBOA2.7023e+032.7249e+032.7043e+035.0978
CLSOBBOA2.7008e+032.7033e+032.7018e+030.0909
F27BOA2.8612e+033.2305e+033.0001e+031.5009e+02
OBBOA2.7465e+033.1371e+032.9313e+031.4431e+02
CLSBOA2.7710e+032.9000e+032.8521e+0369.0391
CLSOBBOA2.7480e+032.9000e+032.8568e+030.469+e02
F28BOA30003.5324e+033.1976e+032.08291e+02
OBBOA3.3249e+033.6655e+033.5018e+031.02142e+02
CLSBOA3.0000e+033.0000e+033.0000e+030.0054
CLSOBBOA3.0000e+033.0000e+033.0000e+031.790e-4
F29BOA31001.1511e+052.4659e+043.3954e+04
OBBOA31001.0710e+062.7354e+053.7340e+05
CLSBOA31003.5565e+044.7232e+037.2595e+03
CLSOBBOA31005.8748e+053.7343e+041.6260e+03
F30BOA5.9971e+032.6134e+041.0799e+046.7561e+03
OBBOA3.2000e+033.7058e+041.2980e+048.5345e+03
CLSBOA32002.4155e+048.0275e+036.3396e+03
CLSOBBOA4.1627e+035.8775e+041.1109e+048.4793e+02

Table 4 shows the comparison of CLSOBBOA (the best proposed version) with other state-of-the-art metaheuristic algorithms. It can be noticed that CLSOBBOA achieves the best results and ranks first in almost half of the benchmark functions. Figs 1, 2 and 3 show the convergence curves for these functions. Also, the Wilcoxon signed rank test [55, 56] has been performed between CLSOBBOA and the native BOA, as given in Table 5, with a significance level of 5%.

Table 4
The comparison results of all algorithms over 30 functions.
F1F2F3
AvgStdAvgStdAvgStd
CLSOBBOA9.5454e+41.2667e+56.6739e+34.5402e+38.5819e+031.7285e+3
BOA1.0080e+085.2571e+074.4261e+092.5605e+091.4306e+042.5048e+03
GWO9.5526e+065.0611e+069.0682e+072.4562e+081.3114e+049.0242e+03
MFO3.5572e+067.2399e+061.1676e+092.2771e+081.9628e+041.5038e+04
PSO2.5249e+078.0505e+065.0614e+32.3787e+35.2453e+33.9510e+3
SCA1.2039e+075.1675e+069.3345e+084.7255e+081.1411e+048.8356e+03
WOA1.1876e+078.1442e+062.0966e+071.2829e+075.9297e+043.9137e+04
F4F5F6
AvgStdAvgStdAvgStd
CLSOBBOA4.2516e+213.3105.2028e+020.05656.019e+20.577
BOA2.6292e+035.6597e+025.2049e+020.10506.0832e+021.0965
GWO4.3397e+025.92975.2044e+020.12276.0253e+021.0790
MFO4.2751e+021.3855e+025.2012e+20.13296.0456e+021.7994
PSO1.1304e+0320.3435.2040e+020.10736.0722e+021.0849
SCA4.9472e+0232.1425.2048e+020.12306.0762e+021.4810
WOA4.5614e+0234.9005.2024e+020.10896.0854e+021.5315
F7F8F9
AvgStdAvgStdAvgStd
CLSOBBOA7.0012e+20.060328.0436e+22.567719.5529e+026.2637
BOA8.7396e+0262.35458.7199e+0210.15359.6592e+029.0121
GWO7.0123e+020.773488.1427e+026.451959.1895e+027.9132
MFO8.0137e+0216.41458.2414e+0210.67459.3011e+0212.381
PSO7.0097e+022.059178.5830e+026.34509.1282e+24.4846
SCA7.1329e+024.262728.4631e+0211.24889.5284e+029.6501
WOA7.0165e+020.505188.5151e+0220.85189.4555e+0220.916
F10F11F12
AvgStdAvgStdAvgStd
CLSOBBOA1.1924e+31.4853e+021.7170e+31.6215e+21.2009e+30.1381
BOA2.6438e+031.8702e+022.7892e+032.9413e+021.2019e+030.2708
GWO1.4089e+031.9919e+022.3330e+031.6815e+021.2012e+030.6264
MFO1.5960e+032.5578e+022.0165e+033.0732e+021.2003e+030.2121
PSO2.3420e+031.2159e+21.7719e+033.6301e+021.2013e+030.4253
SCA2.0964e+032.4770e+022.5883e+031.9564e+021.2015e+030.3095
WOA1.6769e+033.5197e+022.2302e+033.3642e+021.2012e+030.3191
F13F14F15
AvgStdAvgStdAvgStd
CLSOBBOA1.3001e+30.054901.4002e+30.12921.5024e+031.04857
BOA1.3045e+030.68911.4354e+039.39796.5005e+035.07867e+03
GWO1.3002e+030.066161.4004e+030.18981.8759e+032.0916e+02
MFO1.3003e+030.166211.4007e+031.04471.5041e+0310.8810
PSO1.3034e+030.240701.4002e+30.05851.5014e+30.75305
SCA1.3007e+030.175441.4015e+030.65021.5110e+033.99372
WOA1.3004e+030.189681.4243e+035.17561.5086e+036.54693
F16F17F18
AvgStdAvgStdAvgStd
CLSOBBOA1.6033e+30.05984.8701e+042.5502e+047.9930e+33.3053e+3
BOA1.6036e+030.16553.3745e+051.1954e+053.0925e+051.06959e+06
GWO1.6028e+030.38277.0802e+041.5951e+051.3989e+041.06251e+04
MFO1.6033e+30.48421.9565e+053.3550e+052.2320e+041.50917e+04
PSO1.6028e+030.42331.2951e+42.0212e+49.1989e+031.06636e+04
SCA1.6035e+030.22716.4658e+041.5044e+053.0945e+042.02963e+04
WOA1.6036e+030.34391.9715e+053.3453e+051.7006e+041.33556e+04
F19F20F21
AvgStdAvgStdAvgStd
CLSOBBOA1.9026e+30.90605.1116e+31.5581e+33.6120e+32.0764e+3
BOA1.9389e+0318.05752.0606e+041.9813e+043.2084e+054.0276e+05
GWO1.9118e+032.14921.0008e+045.9314e+031.2467e+046.3934e+03
MFO1.9029e+030.82511.5952e+041.9123e+041.2523e+041.1755e+04
PSO1.9027e+031.38418.2582e+036.5637e+032.0928e+041.2878e+04
SCA1.9061e+031.01248.7350e+035.4998e+031.8935e+041.0503e+04
WOA1.9070e+031.90111.4986e+048.6110e+032.2521e+053.2664e+05
F22F23F24
AvgStdAvgStdAvgStd
CLSOBBOA2.230e+318.570325009.530302.5592e+038.6636
BOA2.4490e+0361.259225008.436502.5918e+0312.1467
GWO2.3164e+0361.76992.6324e+033.017322.5271e+0315.6449
MFO2.3047e+0375.20922.6347e+036.723522.5443e+0316.1036
PSO2.3058e+0338.26402.6294e+031.922e-072.522e+036.555
SCA2.2910e+0328.25352.6497e+038.066362.5582e+039.27922
WOA2.3114e+0381.51652.6191e+0351.81882.5903e+0321.0583
F25F26F27
AvgStdAvgStdAvgStd
CLSOBBOA2.682e+035.455612.7001e+030.09092.748e+030.469+e02
BOA2.6982e+039.49972.7034e+031.75183.0001e+031.5009e+02
GWO2.6953e+0317.14372.7001e+030.05633.0280e+031.1592e+02
MFO2.6991e+0317.16352.7002e+030.17853.0685e+031.2750e+02
PSO2.6918e+0334.44132.7001e+030.07362.9463e+031.6596e+02
SCA2.7004e+037.301562.7008e+030.19003.0131e+031.6786e+02
WOA2.6968e+039.232252.7004e+030.17863.0791e+032.0168e+02
F28F29F30
AvgStdAvgStdAvgStd
CLSOBBOA3.000e+031.790e-43.100e+031.6260e+034.1627e+038.4793e+02
BOA3.1976e+032.0829e+022.4659e+043.3954e+041.0799e+046.7561e+03
GWO3.2956e+0387.27678.5841e+051.0925e+064.4923e+037.4357e+02
MFO3.1988e+0336.68563.8029e+034.636e+023.795e+032.893e+02
PSO3.2615e+0365.29748.0351e+051.6493e+063.9944e+033.8509e+02
SCA3.2828e+0353.75551.0608e+046.0870e+035.0231e+031.0682e+03
WOA3.4616e+031.7564e+026.3032e+051.0588e+066.0717e+031.5832e+03
Table 5
Results of Wilcoxon signed rank test.
Fun. | p-value | Decision | Fun. | p-value | Decision
F1 | 6.4e-10 | + | F2 | 2.7e-8 | +
F3 | 4.4e-6 | + | F4 | 2.4e-5 | +
F5 | 3.3e-5 | + | F6 | 7.3e-6 | +
F7 | 4.8e-5 | + | F8 | 6.2e-6 | +
F9 | 4.3e-4 | + | F10 | 4.3e-8 | +
F11 | 5.1e-6 | + | F12 | 2.4e-6 | +
F13 | 6.9e-4 | + | F14 | 3.7e-5 | +
F15 | 2.4e-3 | + | F16 | 2.2e-4 | +
F17 | 3.5e-4 | + | F18 | 4.8e-5 | +
F19 | 1.3e-6 | + | F20 | 3.8e-5 | +
F21 | 4.1e-6 | + | F22 | 6.4e-6 | +
F23 | 6.7e-4 | + | F24 | 4.7e-5 | +
F25 | 2.7e-3 | + | F26 | 4.2e-4 | +
F27 | 2.5e-4 | + | F28 | 4.6e-5 | +
F29 | 3.3e-6 | + | F30 | 3.8e-5 | +
Fig 1
Convergence curves for all algorithms on F1–F10.

Fig 2
Convergence curves for all algorithms on F10–F20.

Fig 3
Convergence curves for all algorithms on F20–F30.

Furthermore, Figs 4 and 5 show box plots for selected functions: unimodal (F1 and F3), multimodal (F4, F7, F9, F11, F13, and F16), hybrid (F18, F20, F21 and F22), and composite (F25, F27, F28, and F30). It is obvious that the box plots of CLSOBBOA are narrower than those of the original BOA, and considerably narrower than those of the other comparative metaheuristic algorithms.

Fig 4
Box plots for some unimodal and multimodal functions.

Fig 5
Box plots for some hybrid and composite functions.

3.2 Engineering problem

A common approach to evaluating a metaheuristic algorithm is to test it on real constrained engineering problems. These problems involve many equality and inequality constraints, and the optimal parameter values of most engineering problems are unknown. In this subsection, 4 engineering optimization problems are used to test CLSOBBOA: the welded beam design, tension/compression spring, pressure vessel design, and speed reducer design problems.
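A common way to let an unconstrained metaheuristic handle such inequality constraints is a static penalty function; the paper does not state its constraint-handling scheme, so the wrapper below (with its function name and penalty weight rho) is an illustrative assumption, not the authors' method.

```python
def penalized(obj, constraints, rho=1e6):
    """Wrap an objective with a static penalty: for each constraint
    g(x) <= 0, add rho times the squared violation max(0, g(x))**2."""
    def f(x):
        viol = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return obj(x) + rho * viol
    return f
```

Any of the BOA variants can then minimize `penalized(cost, [g1, g2, ...])` directly; feasible points keep their raw cost, while infeasible points are heavily penalized.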

3.2.1 Welded beam design problem

This engineering problem, proposed by Coello in [57], has 4 design parameters: the thickness of the weld h, the length of the clamped bar l, the bar thickness b, and the height of the bar t. The mathematical representation is given in Appendix 6.1. Table 6 shows the results of CLSOBBOA compared with Animal Migration Optimization (AMO) [58], the Water Cycle Algorithm (WCA) [59], the Lightning Search Algorithm (LSA) [60], Symbiotic Organisms Search (SOS) [61], and the Grey Wolf Optimizer (GWO) [62].

Table 6
Optimization results for welded beam design problem.
Algorithm | h | l | t | b | Cost
CLSOBBOA | 0.205729 | 3.470488 | 9.036622 | 0.205729 | 1.724852
AMO | 0.223960 | 3.591024 | 8.834515 | 0.223960 | 1.873459
WCA | 0.205730 | 3.470489 | 9.036624 | 0.205730 | 1.724852315
LSA | 0.205730 | 3.470488 | 9.036623 | 0.205730 | 1.724852526
SOS | 0.205730 | 3.470745 | 9.036354 | 0.205744 | 1.724953103
GWO | 0.205587 | 3.475084 | 9.035006 | 0.205808 | 1.725571417

3.2.2 Tension/Compression spring

The second constrained engineering problem is the tension/compression spring problem proposed by Arora [63]. Its main goal is to minimize the weight of the designed spring by finding the optimal values of 3 parameters: the wire diameter d, the mean coil diameter D, and the number of active coils N. Appendix 6.2 gives its mathematical definition. Table 7 compares the results of the CLSOBBOA algorithm with WCA, ABC [64], TLBO [65], and SOS.

Table 7
Optimization results for the tension/compression design problem.

Algorithm   d         D         N          Cost
CLSOBBOA    0.051688  0.356715  11.289108  0.012665
WCA         0.051773  0.358734  11.171709  0.012665
ABC         0.052717  0.381929  9.951875   0.012685948
TLBO        0.051790  0.359142  11.148539  0.012665851
SOS         0.051808  0.359577  11.125     0.012667638

3.2.3 Pressure vessel design

One of the most famous engineering problems is the pressure vessel design, introduced by Kannan and Kramer in [66], which aims to minimize the cost of materials, welding, and forming. This problem has 4 parameters: the shell thickness Ts, the head thickness Th, the inner radius R, and the length of the cylindrical section L. The mathematical definition of this problem is shown in Appendix 6.3. Results of CLSOBBOA compared to the state-of-the-art algorithms LSA, SOS, ABC, and GWO are shown in Table 8.

Table 8
Optimization results for pressure vessel design problem.

Algorithm   Ts        Th        R          L          Cost
CLSOBBOA    0.778168  0.384649  40.319618  200        5885.332773
LSA         0.843656  0.417020  43.712767  40.363464  6006.957652
SOS         0.779253  3.850801  157.609    199.458    5889.984071
ABC         7.781687  3.846492  40.319620  200        5885.333300
GWO         0.778915  0.384960  40.342623  200        5889.412437

3.2.4 Speed reducer design problem

The last engineering problem introduced in this section is the speed reducer problem. The objective is to find the best values of seven parameters: the face width b, the teeth module m, the number of teeth on the pinion p, the length of the first shaft between bearings l1, the length of the second shaft between bearings l2, and the diameters of the two shafts d1 and d2. The mathematical representation is shown in Appendix 6.4. Table 9 compares the results of CLSOBBOA with GWO, AMO, WCA, and SOS.

Table 9
Optimization results for speed reducer design problem.

Algorithm   b         m    p   l1     l2    d1       d2       Cost
CLSOBBOA    3.501260  0.7  17  7.380  7.83  3.33241  5.26345  2995.775
GWO         3.501591  0.7  17  7.391  7.82  3.35127  5.28074  2998.5507
AMO         3.506700  0.7  17  7.380  7.82  3.35784  5.27676  3001.944
WCA         3.500219  0.7  17  8.379  7.84  3.35241  5.28671  3005.222
SOS         3.538402  0.7  17  7.392  7.81  3.3580   5.28677  3002.928

3.3 CLSOBBOA in Feature Selection (FS)

In this subsection, CLSOBBOA is used to solve the FS problem on 5 different datasets.

3.3.1 CLSOBBOA architecture for FS

To solve feature selection (FS), we regard it as a binary optimization problem since the solutions are limited to {0, 1}, where "0" means the corresponding attribute has not been selected and "1" means it has. To convert a continuous solution to a binary one, a transfer function is needed. In this paper, we use the sigmoid function shown in the following equation:

S(xik) = 1 / (1 + e^(−xik))

where xik refers to the position of the i-th agent at dimension k.

The output of the previous equation is still continuous; to obtain a binary value, the following stochastic rule is used:

xik = 1 if rand < S(xik), otherwise xik = 0

where rand is a random number drawn uniformly from [0, 1].

The FS fitness function seeks the smallest number of selected features together with the highest classification accuracy, so the FS fitness equation is as follows:

Fitness = α γ(D) + β |R| / |C|

where γ(D) refers to the classification error rate, |C| is the total number of features, |R| is the number of selected features, α ∈ [0, 1], and β = 1 − α.
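The three expressions above can be sketched in Python. This is a minimal illustration, not the paper's implementation: the function names (`sigmoid_transfer`, `binarize`, `fs_fitness`) are ours, and the error rate γ(D) is passed in directly rather than computed by a real classifier.

```python
import math
import random

def sigmoid_transfer(x):
    """Map a continuous position value to a selection probability S(x) in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(position, rng=random):
    """Stochastic rounding: dimension k becomes 1 if rand < S(x), else 0."""
    return [1 if rng.random() < sigmoid_transfer(x) else 0 for x in position]

def fs_fitness(error_rate, selected_mask, alpha=0.99):
    """Fitness = alpha * gamma(D) + beta * |R| / |C|, with beta = 1 - alpha."""
    beta = 1.0 - alpha
    n_selected = sum(selected_mask)   # |R|
    n_total = len(selected_mask)      # |C|
    return alpha * error_rate + beta * n_selected / n_total

# A solution selecting 5 of 10 features with a 10% error rate:
mask = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
print(round(fs_fitness(0.10, mask), 4))  # 0.99*0.10 + 0.01*0.5 = 0.104
```

Note how a small β keeps accuracy dominant while still rewarding smaller feature subsets.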

3.3.2 Experimental setup & results

Here, 5 different datasets from the UCI repository have been used to evaluate the performance of CLSOBBOA in solving the FS problem. The details of each dataset can be found in Table 10. The results of CLSOBBOA compared with the original BOA, PSO, and GWO are shown in Tables 11–13 in terms of average fitness, selected feature size, and classification accuracy. From these results, we can conclude the significance of CLSOBBOA in solving FS.

Table 10
Descriptions of datasets.

Symbol  Dataset       No. of features  No. of instances
DS1     Breastcancer  10               699
DS2     BreastEW      31               569
DS3     WineEW        14               178
DS4     segment       20               2310
DS5     Zoo           17               101
Table 11
Statistical mean fitness measure calculated for the compared algorithms on the different datasets.

Dataset  CLSOBBOA  BOA    PSO    GWO
DS1      0.300     0.451  0.356  0.416
DS2      0.025     0.056  0.042  0.056
DS3      0.010     0.030  0.014  0.022
DS4      0.025     0.043  0.033  0.045
DS5      0.008     0.026  0.013  0.031
Table 12
Average classification accuracy for the compared algorithms on the different datasets.

Dataset  CLSOBBOA  BOA    PSO    GWO
DS1      0.987     0.940  0.988  0.978
DS2      0.951     0.915  0.985  0.962
DS3      0.999     0.981  0.996  0.992
DS4      0.985     0.946  0.984  0.977
DS5      0.999     0.981  0.996  0.996
Table 13
Average selection size for the compared algorithms on the different datasets.

Dataset  CLSOBBOA  BOA   PSO   GWO
DS1      3.4       3.8   3.6   4.6
DS2      5.4       12.4  12.9  15.7
DS3      2.6       5.2   3.7   6.1
DS4      4.1       7.6   6.4   9.1
DS5      3.1       6.1   4.3   6.5

4 Conclusion & future work

In this paper, three variants of the BOA algorithm have been introduced to improve its performance and prevent it from getting trapped in a suboptimal region. These versions merge the original BOA with a chaotic local search strategy and opposition-based learning concepts. The results show that the algorithm named CLSOBBOA ranked first on more than half of the CEC2014 benchmark functions. In addition, the proposed algorithm was tested on 4 different constrained engineering problems and applied to the feature selection problem using 5 UCI datasets.

6 Appendix B

6.1 Welded beam design problem

Minimize: f1(x) = 1.10471x1²x2 + 0.04811x3x4(14.0 + x2)

Subject to: g1(x) = τ − 13600 ≤ 0

g2(x) = σ − 30000 ≤ 0

g3(x) = x1 − x4 ≤ 0

g4(x) = 6000 − P ≤ 0

where τ is the shear stress, σ is the bending stress, and P is the buckling load.

Variable Range

0.125 ≤ x1 ≤ 5

0.1 ≤ x2 ≤ 10

0.1 ≤ x3 ≤ 10

0.125 ≤ x4 ≤ 5
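As a sanity check, the objective above can be evaluated at the CLSOBBOA solution from Table 6 (with x1 = h, x2 = l, x3 = t, x4 = b as in Section 3.2.1). The constraint terms τ, σ, and P involve stress formulas not reproduced in this appendix, so this sketch (function names are ours) only checks the cost and the variable bounds.

```python
def welded_beam_cost(h, l, t, b):
    """f1(x) = 1.10471*h^2*l + 0.04811*t*b*(14.0 + l)."""
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

def in_bounds(h, l, t, b):
    """Variable ranges from Appendix 6.1."""
    return (0.125 <= h <= 5 and 0.1 <= l <= 10
            and 0.1 <= t <= 10 and 0.125 <= b <= 5)

# CLSOBBOA solution from Table 6: cost should be close to 1.724852
x = (0.205729, 3.470488, 9.036622, 0.205729)
print(in_bounds(*x), round(welded_beam_cost(*x), 6))
```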

6.2 Tension/Compression spring design problem

Minimize: f(x) = (x3 + 2)x2x1²

Subject to: g1(x) = 1 − (x2³x3)/(71785x1⁴) ≤ 0

g2(x) = (4x2² − x1x2)/(12566(x2x1³ − x1⁴)) + 1/(5108x1²) − 1 ≤ 0

g3(x) = 1 − (140.45x1)/(x2²x3) ≤ 0

g4(x) = (x2 + x1)/1.5 − 1 ≤ 0

Variable Range

0.05 ≤ x1 ≤ 2.00

0.25 ≤ x2 ≤ 1.30

2.00 ≤ x3 ≤ 15.00
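The spring design above is compact enough to verify directly. A minimal sketch (function names are ours), using x1 = d, x2 = D, x3 = N as in Section 3.2.2, evaluates the weight at the CLSOBBOA solution from Table 7:

```python
def spring_weight(d, D, N):
    """f(x) = (N + 2) * D * d^2, the spring weight to minimize."""
    return (N + 2) * D * d**2

def spring_constraints(d, D, N):
    """g1..g4 from Appendix 6.2; each must be <= 0 for feasibility."""
    g1 = 1 - (D**3 * N) / (71785 * d**4)
    g2 = (4 * D**2 - d * D) / (12566 * (D * d**3 - d**4)) + 1 / (5108 * d**2) - 1
    g3 = 1 - 140.45 * d / (D**2 * N)
    g4 = (D + d) / 1.5 - 1
    return g1, g2, g3, g4

# CLSOBBOA solution from Table 7: weight should be close to 0.012665
x = (0.051688, 0.356715, 11.289108)
print(round(spring_weight(*x), 6))
```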

6.3 Pressure vessel design problem

Minimize: f(x) = 0.6224x1x3x4 + 1.7781x2x3² + 3.1661x1²x4 + 19.84x1²x3

Subject to: g1(x) = −x1 + 0.0193x3 ≤ 0

g2(x) = −x2 + 0.00954x3 ≤ 0

g3(x) = −πx3²x4 − (4/3)πx3³ + 1,296,000 ≤ 0

g4(x) = x4 − 240 ≤ 0

Variable Range

0 ≤ xi ≤ 100,   i = 1, 2

0 ≤ xi ≤ 200,   i = 3, 4
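The pressure vessel problem is fully specified above, so both cost and constraints can be coded directly. The sketch below (function names are ours), with x1 = Ts, x2 = Th, x3 = R, x4 = L as in Section 3.2.3, evaluates the CLSOBBOA solution from Table 8:

```python
import math

def vessel_cost(Ts, Th, R, L):
    """f(x) = 0.6224*Ts*R*L + 1.7781*Th*R^2 + 3.1661*Ts^2*L + 19.84*Ts^2*R."""
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def vessel_constraints(Ts, Th, R, L):
    """g1..g4 from Appendix 6.3; each must be <= 0 for feasibility."""
    g1 = -Ts + 0.0193 * R
    g2 = -Th + 0.00954 * R
    g3 = -math.pi * R**2 * L - (4.0 / 3.0) * math.pi * R**3 + 1_296_000
    g4 = L - 240
    return g1, g2, g3, g4

# CLSOBBOA solution from Table 8: cost should be close to 5885.332773
x = (0.778168, 0.384649, 40.319618, 200)
print(round(vessel_cost(*x), 3))
```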

6.4 Speed reducer design problem

Minimize: f(x) = 0.7854x1x2²(3.3333x3² + 14.9334x3 − 43.0934) − 1.508x1(x6² + x7²) + 7.4777(x6³ + x7³) + 0.7854(x4x6² + x5x7²)

Subject to:

Variable Range

2.6 ≤ x1 ≤ 3.6

0.7 ≤ x2 ≤ 0.8

17 ≤ x3 ≤ 28

7.3 ≤ x4 ≤ 8.3

7.8 ≤ x5 ≤ 8.3

2.9 ≤ x6 ≤ 3.9

5 ≤ x7 ≤ 5.5

6.5 Gear train design problem

Minimize: f(x) = (1/6.931 − (x2x3)/(x1x4))²

Variable Range

12 ≤ xi ≤ 60,   i = 1, 2, 3, 4
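The gear-train objective can be checked at the well-known integer solution (x1, x2, x3, x4) = (43, 16, 19, 49), which drives the squared ratio error down to roughly 2.7e-12 (a standard result for this benchmark). The sketch below (function name is ours) evaluates only the objective; the search itself is not implemented.

```python
def gear_ratio_error(x1, x2, x3, x4):
    """f(x) = (1/6.931 - (x2*x3)/(x1*x4))^2, the squared gear-ratio error."""
    return (1.0 / 6.931 - (x2 * x3) / (x1 * x4)) ** 2

# All four tooth counts must be integers in [12, 60].
f = gear_ratio_error(43, 16, 19, 49)
print(f"{f:.3e}")  # on the order of 1e-12
```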

References

1. D. G. Luenberger, Y. Ye, et al., "Linear and nonlinear programming", Vol. 2, Springer, 1984.

2. A. R. Simpson, G. C. Dandy, L. J. Murphy, "Genetic algorithms compared to other techniques for pipe optimization", Journal of Water Resources Planning and Management 120 (4) (1994) 423–443. 10.1061/(ASCE)0733-9496(1994)120:4(423)

3. A. G. Hussien, A. E. Hassanien, E. H. Houssein, "Swarming behaviour of salps algorithm for predicting chemical compound activities", in: 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), IEEE, 2017.

4. A. G. Hussien, E. H. Houssein, A. E. Hassanien, "A binary whale optimization algorithm with hyperbolic tangent fitness function for feature selection", in: 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), IEEE, 2017.

5. A. G. Hussien, et al., "S-shaped binary whale optimization algorithm for feature selection", in: Recent Trends in Signal and Image Processing, Springer, Singapore, 2019, pp. 79–87.

6. J. H. Holland, "Genetic algorithms", Scientific American 267 (1) (1992) 66–73.

7. R. Storn, K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces", Journal of Global Optimization 11 (4) (1997) 341–359. 10.1023/A:1008202821328

8. R. Eberhart, J. Kennedy, "A new optimizer using particle swarm theory", in: MHS'95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, IEEE, 1995, pp. 39–43.

9. D. Karaboga, B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm", Journal of Global Optimization 39 (3) (2007) 459–471. 10.1007/s10898-007-9149-x

10. M. Dorigo, G. Di Caro, "Ant colony optimization: a new meta-heuristic", in: Proceedings of the 1999 Congress on Evolutionary Computation—CEC99 (Cat. No. 99TH8406), Vol. 2, IEEE, 1999, pp. 1470–1477.

11. S. Kirkpatrick, C. D. Gelatt, M. P. Vecchi, "Optimization by simulated annealing", Science 220 (4598) (1983) 671–680. 10.1126/science.220.4598.671

12. X.-S. Yang, S. Deb, "Cuckoo search via Lévy flights", in: 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), IEEE, 2009, pp. 210–214.

13. E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, "GSA: a gravitational search algorithm", Information Sciences 179 (13) (2009) 2232–2248. 10.1016/j.ins.2009.03.004

14. A. G. Hussien, M. Amin, M. Wang, G. Liang, A. Alsanad, A. Gumaei, H. Chen, "Crow Search Algorithm: Theory, Recent Advances, and Applications", IEEE Access (2020).

15. S. Mirjalili, "Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems", Neural Computing and Applications 27 (4) (2016) 1053–1073. 10.1007/s00521-015-1920-1

16. D. Simon, "Biogeography-based optimization", IEEE Transactions on Evolutionary Computation 12 (6) (2008) 702–713. 10.1109/TEVC.2008.919004

17. A. H. Gandomi, X.-S. Yang, A. H. Alavi, S. Talatahari, "Bat algorithm for constrained optimization tasks", Neural Computing and Applications 22 (6) (2013) 1239–1255. 10.1007/s00521-012-1028-9

18. S. Mirjalili, A. Lewis, "The whale optimization algorithm", Advances in Engineering Software 95 (2016) 51–67. 10.1016/j.advengsoft.2016.01.008

19. S. Saremi, S. Mirjalili, A. Lewis, "Grasshopper optimisation algorithm: theory and application", Advances in Engineering Software 105 (2017) 30–47. 10.1016/j.advengsoft.2017.01.004

20. G. Dhiman, V. Kumar, "Emperor penguin optimizer: A bio-inspired algorithm for engineering problems", Knowledge-Based Systems 159 (2018) 20–50. 10.1016/j.knosys.2018.06.001

21. M. Jain, V. Singh, A. Rani, "A novel nature-inspired algorithm for optimization: Squirrel search algorithm", Swarm and Evolutionary Computation 44 (2019) 148–175. 10.1016/j.swevo.2018.02.013

22. G. Dhiman, V. Kumar, "Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems", Knowledge-Based Systems 165 (2019) 169–196. 10.1016/j.knosys.2018.11.024

23. Z. Wei, C. Huang, X. Wang, T. Han, Y. Li, "Nuclear reaction optimization: A novel and powerful physics-based algorithm for global optimization", IEEE Access.

24. S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, S. M. Mirjalili, "Salp swarm algorithm: A bio-inspired optimizer for engineering design problems", Advances in Engineering Software 114 (2017) 163–191. 10.1016/j.advengsoft.2017.07.002

25. A. A. Heidari, S. Mirjalili, H. Faris, I. Aljarah, M. Mafarja, H. Chen, "Harris hawks optimization: Algorithm and applications", Future Generation Computer Systems 97 (2019) 849–872. 10.1016/j.future.2019.02.028

26. S. Li, H. Chen, M. Wang, A. A. Heidari, S. Mirjalili, "Slime mould algorithm: A new method for stochastic optimization", Future Generation Computer Systems (2020).

27. F. A. Hashim, E. H. Houssein, M. S. Mabrouk, W. Al-Atabany, S. Mirjalili, "Henry gas solubility optimization: A novel physics-based algorithm", Future Generation Computer Systems 101 (2019) 646–667. 10.1016/j.future.2019.07.015

28. G. G. Wang, S. Deb, L. D. S. Coelho, "Elephant herding optimization", in: 2015 3rd International Symposium on Computational and Business Intelligence (ISCBI), IEEE, 2015, pp. 1–5.

29. A. S. Assiri, A. G. Hussien, M. Amin, "Ant Lion Optimization: variants, hybrids, and applications", IEEE Access 8 (2020) 77746–77764.

30. A. G. Hussien, M. Amin, M. Abd El Aziz, "A comprehensive review of moth-flame optimisation: variants, hybrids, and applications", Journal of Experimental & Theoretical Artificial Intelligence (2020) 1–21.

31. S. Arora, S. Singh, "Butterfly optimization algorithm: a novel approach for global optimization", Soft Computing 23 (3) (2019) 715–734. 10.1007/s00500-018-3102-4

32. K. Aygül, M. Cikan, T. Demirdelen, M. Tumay, "Butterfly optimization algorithm based maximum power point tracking of photovoltaic systems under partial shading condition", Energy Sources, Part A: Recovery, Utilization, and Environmental Effects (2019) 1–19.

33. D. K. Lal, A. Barisal, S. D. Madasu, "AGC of a two area nonlinear power system using BOA optimized FOPID+PI multistage controller", in: 2019 Second International Conference on Advanced Computational and Communication Paradigms (ICACCP), IEEE, 2019, pp. 1–6.

34. S. Arora, P. Anand, "Learning automata-based butterfly optimization algorithm for engineering design problems", International Journal of Computational Materials Science and Engineering 7 (04) (2018) 1850021. 10.1142/S2047684118500215

35. G. Li, F. Shuang, P. Zhao, C. Le, "An improved butterfly optimization algorithm for engineering design problems using the cross-entropy method", Symmetry 11 (8) (2019) 1049. 10.3390/sym11081049

36. S. Arora, P. Anand, "Binary butterfly optimization approaches for feature selection", Expert Systems with Applications 116 (2019) 147–160. 10.1016/j.eswa.2018.08.051

37. B. Zhang, X. Yang, B. Hu, Z. Liu, Z. Li, "OEbBOA: A Novel Improved Binary Butterfly Optimization Approaches With Various Strategies for Feature Selection", IEEE Access 8 (2020) 67799–67812.

38. Y. Fan, J. Shao, G. Sun, X. Shao, "A Self-adaption Butterfly Optimization Algorithm for Numerical Optimization Problems", IEEE Access (2020).

39. Y. Guo, X. Liu, L. Chen, "Improved butterfly optimisation algorithm based on guiding weight and population restart", Journal of Experimental & Theoretical Artificial Intelligence (2020) 1–19. 10.1080/0952813X.2020.1725651

40. Z. Wang, Q. Luo, Y. Zhou, "Hybrid metaheuristic algorithm using butterfly and flower pollination base on mutualism mechanism for global optimization problems", Engineering with Computers (2020).

41. A. Toktas, D. Ustun, "A Triple-Objective Optimization Scheme using Butterfly-Integrated ABC Algorithm for Design of Multi-Layer RAM", IEEE Transactions on Antennas and Propagation (2020). 10.1109/TAP.2020.2981728

42. S. Sharma, A. K. Saha, "m-MBOA: a novel butterfly optimization algorithm enhanced with mutualism scheme", Soft Computing (2019) 1–19.

43. H.-J. Meng, P. Zheng, R.-Y. Wu, X.-J. Hao, Z. Xie, "A hybrid particle swarm algorithm with embedded chaotic search", in: IEEE Conference on Cybernetics and Intelligent Systems, 2004, Vol. 1, IEEE, 2004, pp. 367–371.

44. B. Shaw, V. Mukherjee, S. Ghoshal, "A novel opposition-based gravitational search algorithm for combined economic and emission dispatch problems of power systems", International Journal of Electrical Power & Energy Systems 35 (1) (2012) 21–33. 10.1016/j.ijepes.2011.08.012

45. A. R. Malisia, H. R. Tizhoosh, "Applying opposition-based ideas to the ant colony system", in: 2007 IEEE Swarm Intelligence Symposium, IEEE, 2007, pp. 182–189.

46. S. Gupta, K. Deep, "An opposition-based chaotic grey wolf optimizer for global optimisation tasks", Journal of Experimental & Theoretical Artificial Intelligence 31 (5) (2019) 751–779. 10.1080/0952813X.2018.1554712

47. S. Rahnamayan, H. R. Tizhoosh, M. M. Salama, "Opposition versus randomness in soft computing techniques", Applied Soft Computing 8 (2) (2008) 906–918. 10.1016/j.asoc.2007.07.010

48. M. Hasegawa, T. Ikeguchi, K. Aihara, K. Itoh, "A novel chaotic search for quadratic assignment problems", European Journal of Operational Research 139 (3) (2002) 543–556. 10.1016/S0377-2217(01)00189-8

49. B. Alatas, "Chaotic bee colony algorithms for global numerical optimization", Expert Systems with Applications 37 (8) (2010) 5682–5687. 10.1016/j.eswa.2010.02.042

50. I. Saccheri, M. Kuussaari, M. Kankare, P. Vikman, W. Fortelius, I. Hanski, "Inbreeding and extinction in a butterfly metapopulation", Nature 392 (6675) (1998) 491. 10.1038/33136

51. H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence", in: International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'06), Vol. 1, IEEE, 2005, pp. 695–701.

52. I. Fister Jr., X.-S. Yang, J. Brest, D. Fister, I. Fister, "Analysis of randomisation methods in swarm intelligence", International Journal of Bio-Inspired Computation 7 (1) (2015) 36–49. 10.1504/IJBIC.2015.067989

53. I. Fister, X.-S. Yang, J. Brest, "On the randomized firefly algorithm", in: Cuckoo Search and Firefly Algorithm, Springer, 2014, pp. 27–48.

54. A. G. Hussien, et al., "New binary whale optimization algorithm for discrete optimization problems", Engineering Optimization (2019) 1–15.

55. F. Wilcoxon, "Individual comparisons by ranking methods", in: Breakthroughs in Statistics, Springer, New York, NY, 1992, pp. 196–202.

56. A. G. Hussien, D. Oliva, E. H. Houssein, A. A. Juan, X. Yu, "Binary Whale Optimization Algorithm for Dimensionality Reduction", Mathematics 8 (10) (2020) 1821. 10.3390/math8101821

57. C. A. C. Coello, "Use of a self-adaptive penalty approach for engineering optimization problems", Computers in Industry 41 (2) (2000) 113–127. 10.1016/S0166-3615(99)00046-9

58. X. Li, J. Zhang, M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior", Neural Computing and Applications 24 (7-8) (2014) 1867–1877. 10.1007/s00521-013-1433-8

59. H. Eskandar, A. Sadollah, A. Bahreininejad, M. Hamdi, "Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems", Computers & Structures 110 (2012) 151–166. 10.1016/j.compstruc.2012.07.010

60. H. Shareef, A. A. Ibrahim, A. H. Mutlag, "Lightning search algorithm", Applied Soft Computing 36 (2015) 315–333. 10.1016/j.asoc.2015.07.028; L. Abualigah, M. Abd Elaziz, A. G. Hussien, B. Alsalibi, S. M. J. Jalali, A. H. Gandomi, "Lightning search algorithm: a comprehensive survey", Applied Intelligence (2020) 1–24.

61. M.-Y. Cheng, D. Prayogo, "Symbiotic organisms search: a new metaheuristic optimization algorithm", Computers & Structures 139 (2014) 98–112. 10.1016/j.compstruc.2014.03.007

62. S. Mirjalili, S. M. Mirjalili, A. Lewis, "Grey wolf optimizer", Advances in Engineering Software 69 (2014) 46–61. 10.1016/j.advengsoft.2013.12.007

63. J. S. Arora, "Introduction to Optimum Design", Elsevier, 2004.

64. B. Basturk, "An artificial bee colony (ABC) algorithm for numeric function optimization", in: IEEE Swarm Intelligence Symposium, Indianapolis, IN, USA, 2006.

65. F. Zou, L. Wang, X. Hei, D. Chen, "Teaching-learning-based optimization with learning experience of other learners and its application", Applied Soft Computing 37 (2015) 725–736. 10.1016/j.asoc.2015.08.047

66. B. Kannan, S. N. Kramer, "An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design", Journal of Mechanical Design 116 (2) (1994) 405–411. 10.1115/1.2919393