Abstract
Metaheuristic algorithms play a vital role in addressing a wide range of real-world problems by overcoming hardware and computational constraints. The Chameleon Swarm Algorithm (CSA) is a modern metaheuristic algorithm inspired by the foraging behaviour of chameleons. To improve the capabilities of the CSA, this work proposes a modified version of the algorithm that finds better optimal solutions across various application areas. The effectiveness of the proposed algorithm is assessed using 97 standard benchmark functions and three real-world engineering design problems. To validate its efficacy, the proposed algorithm is compared with a number of well-known and widely used classical algorithms, such as the Gravitational Search Algorithm and the Earthworm Optimization Algorithm. The proposed modified Chameleon Swarm Algorithm using Morlet wavelet mutation and Lévy flight (mCSAMWL) is superior to existing algorithms for both unimodal and multimodal functions, as demonstrated by Friedman's mean rank test, as well as on three real-world engineering design problems. Five performance metrics (average energy consumption, total energy consumption, total residual energy, dead nodes, and cluster head frequency) are taken into consideration when evaluating its performance against state-of-the-art algorithms. For nine different simulation scenarios, the proposed mCSAMWL algorithm outperforms the Atom Search Optimization (ASO), the hybrid Particle Swarm Optimization and Grey Wolf Optimization (PSO-GWO), the Bald Eagle Search Algorithm (BES), the African Vulture Optimization Algorithm (AVOA), and the Chameleon Swarm Algorithm (CSA) in terms of average energy consumption by 50.9%, 52.6%, 45%, 42.4%, and 50.1%, and in terms of total energy consumption by 51.4%, 53.3%, 45.6%, 42.4%, and 50.7%, respectively.
Introduction
Metaheuristic algorithms are effective solutions that can be applied to a wide variety of engineering challenges encountered in the real world1. Compared to deterministic approaches, metaheuristic algorithms have excelled in recent decades due to their adaptability, ability to avoid local optima, and gradient-free framework. Deterministic approaches produce identical results for the same problem, a behavior that may result in local optimum trapping, which is a drawback of deterministic techniques2. Local optima trapping denotes an algorithm becoming stuck in a local solution and, as a result, being unable to find the global solution. Because of their inconsistent performance, deterministic methods can no longer be relied upon for solving practical optimization problems with several possible solutions. Most metaheuristic algorithms are derived from observations of natural phenomena, such as the intelligence of swarms of particles, the logic of biologically inspired processes, and the physics of everyday objects. Evolutionary algorithms mimic natural evolution, in which the fittest candidate survives. These algorithms start with a population of solutions living in a fitness-evaluated environment; genetic crossover and mutation then help the parent population pass on its environmental adaptations to the offspring, and iterative generations are used to find the best solutions for the environment. The Genetic Algorithm (GA), Biogeography-Based Optimization (BBO), Genetic Programming (GP), Differential Evolution (DE), and Evolution Strategies (ES) are examples of evolutionary algorithms.
Physics-based procedures adhere to the notions and norms of physics, in which individuals update their positions according to physical laws such as molecular dynamics, the force of inertia, and the force of gravitation. The Atom Search Optimization, Simulated Annealing, the Artificial Electric Field Algorithm, and the Sine Cosine Algorithm are well-known methods based on physical principles. Nature-inspired metaheuristics based on the “collective intelligence” of swarms are referred to as “swarm intelligence”. Collective knowledge is developed when a group of similar agents collaborate and learn from their surroundings. Colonies of ants, swarms of bees, dense flocks of birds, and many other groups of animals serve as examples of collective intelligence. The coordinated flocking of birds inspired the concept of Particle Swarm Optimization. Fireflies’ flashing habits inspired the development of the Firefly Algorithm. The Bat Algorithm (BA) is a nature-inspired algorithm that employs a sophisticated echolocation-based navigation system. The Ant Colony Optimization was based on the way real-life ant colonies lay down pheromone trails. Cuckoo Search (CS) is an evolutionary optimization method inspired by the behavioral patterns of the cuckoo bird. Among the most prominent swarm-based methods are the Fruit Fly Optimization Algorithm (FFOA), the Ant Colony Optimization (ACO), the Grasshopper Optimization Algorithm (GOA), the Salp Swarm Algorithm (SSA), the Whale Optimization Algorithm (WOA), the African Vulture Optimization Algorithm (AVOA), the Glowworm Swarm Optimization (GSO), and Cat Swarm Optimization (CSO). Existing CSA studies suffer from several problems, such as insufficient diversity, local optima entrapment, and imbalanced exploitation. Each of the optimization algorithms mentioned above must also consider how best to explore and exploit a given search space. Exploration and exploitation3 are the two distinct stages that make up the search process of a population-based algorithm. Exploration refers to spreading the swarm widely in order to study every region of the search space thoroughly, whereas exploitation refers to concentrating the swarm on the promising regions discovered during the exploration phase so that they can be analyzed more closely. Stochastic behaviour makes it difficult to achieve equilibrium between the exploration and exploitation phases.
An effective optimization algorithm strikes a balance between exploring and exploiting the space. An algorithm that performs well on some problems is not guaranteed to be superior on all problems; this observation serves as inspiration for this research. The Chameleon Swarm Algorithm (CSA) is vulnerable to becoming trapped in local optima, and the optimizer may be unable to locate the global solution because it is trapped in a local region. New solutions are generated from the solutions of the previous iteration; consequently, the algorithm's convergence rate may drop, producing solutions that do not effectively cover the entire search space and that converge prematurely. Taking this as motivation, the mCSAMWL algorithm is proposed in this study as an improved version of the CSA to increase the search capability, optimize the balance between the exploitation and exploration phases, and prevent early convergence to local optima. The guiding principle of mCSAMWL is the injection of two effective strategies into the original CSA: Morlet wavelet mutation and the step-reducer Lévy flight. Swarm intelligence is the foundation of the Chameleon Swarm Algorithm (CSA). Recent advancements in optimization, particularly those utilizing Lévy-based search techniques, have led to several innovative approaches. For example, the modified Dynamic Hunting Leadership (mDHL) algorithm4 incorporates the Lévy flight technique, together with localized development strategies, to enhance convergence and solution precision while addressing local optima and convergence delays. The DGS-SCSO optimizer5, an enhanced version of Sand Cat Swarm Optimization (SCSO), incorporates Dynamic Pinhole Imaging and the Golden Sine Algorithm to mitigate issues such as local optima entrapment and slow convergence. Similarly, AEFA-CSR6, a hybrid of the Artificial Electric Field Algorithm with Cuckoo Search and Refraction Learning, improves convergence and solution precision, showing superior performance across benchmark functions and engineering problems.
Related work
CSA has attracted significant attention from researchers due to its simple architecture and ease of implementation. Numerous concepts and approaches have been introduced to further enhance its functionality. This section offers an overview of CSA's development and examines its applications in solving challenges across various domains. Sridharan7 developed a Chameleon Swarm Optimization (CSO) with machine learning-based Sarcasm Detection and Classification (CSOML-SASC) model. Umamageswari et al.8 introduced a framework named FCM-CSA, combining Fuzzy C-Means (FCM) with the Chameleon Swarm Algorithm (CSA), which was used to segment diseased parts of plant leaves. Rizk-Allah et al.9 suggested a memory-based Chameleon Swarm Algorithm (MCSA) that extracts parameters from semi-empirical solid oxide fuel cell models. Anitha et al.10 introduced a Modified Grey Wolf-based Chameleon Swarm Algorithm to minimize energy consumption and enable secure wireless sensor network communication. Rizk-Allah et al.11 introduced a hybrid approach named CSMO, comprising the Chameleon Swarm Algorithm (CSA) and Mayfly Optimization (MO), for solving the Combined Heat and Power Economic Dispatch (CHPED) problem. Mostafa et al.12 proposed a modified mCSA algorithm that incorporates an Artificial Ecosystem-Based Optimization (AEO) consumption operator. Using a multi-objective chameleon swarm optimization algorithm and an advanced feature-selection method, Wang et al.13 introduced a short-term wind speed forecasting system. A Multi-strategy Chameleon Swarm Algorithm (MCSA) was developed by Hu et al.14 using a Crossover-based Comprehensive Learning (CCL) strategy incorporating sinusoidal parameter tuning and fractional-order calculus. RMSCSA, based on the Refraction Mirror Learning (RML) method to promote diversity and on segmental variation of population diversity using an S-type weight, was presented by Damin et al.15. To handle non-convex Economic Load Dispatch (ELD) problems, an Enhanced Chameleon Swarm Algorithm (ECSA) was developed by Braik et al.16 that combines roulette wheel selection with Lévy flight approaches. A hybrid variant of CSA named CCECSA was suggested by Hu et al.17, in which mutation operations and elite guidance strategies were used; CSA also incorporated the horizontal and vertical crossovers of CSO to solve the disk Wang-Ball curve (DWB) degree-reduction optimization models. Sun et al.18 introduced an improved Chameleon Swarm Algorithm called CLCSA-LSTM, enhanced by the Somersault Foraging Technique of the Manta Ray Foraging Optimization algorithm (MRFO) and a boundary neighborhood updating method to maintain a demographically diversified population. It first optimizes the LSTM network hyperparameters and finds the optimal ones, tackling the manual tuning process and the insufficient stability problem; the model was then used to recognize OFDM signals after being trained with these hyperparameters. Zhou and Xu19 used the Chameleon Swarm Algorithm (CSA) within a renewable micro-grid framework to determine the optimal size of each component based on actual local hourly weather data and the load demand over the course of a year. Dinh20 used CSA to build an algorithm that enhanced images by synthesizing the high-frequency layer. Table 1 gives an overview of the most recent modifications that have been suggested for the CSA algorithm.
New advancements in optimization have led to the development of hybrid techniques aimed at improving performance and robustness. Abed-alguni21 introduced the island-based Cuckoo Search (iCSPM) algorithm, which improves population diversity and exploration by integrating an island model and replacing Lévy flight with polynomial mutation, outperforming other methods in accuracy and reliability across standard benchmarks. The iCSPM2 algorithm introduced by Abed-alguni and Paul22 further enhances iCSPM by incorporating Elite Opposition-based Learning and multiple mutation strategies, such as HDP, Jaya, and pitch adjustment, achieving better accuracy and convergence and surpassing four well-known swarm optimization algorithms in benchmark tests. The Exploratory Cuckoo Search (ECS) proposed by Abed-alguni et al.23 improves the original Cuckoo Search by integrating refraction learning, Gaussian perturbation, and multiple mutation methods, outshining traditional CS variants on 14 benchmark functions and exhibiting competitive performance compared with six renowned swarm optimization algorithms. Similarly, the Improved Salp Swarm Algorithm (ISSA)24 introduced by Abed-alguni et al. boosts the SSA's optimization capabilities with Gaussian perturbation, highly disruptive polynomial mutation, Laplace crossover, and mixed opposition-based learning, outperforming other SSA variants and 18 top optimization algorithms in solving single-objective continuous optimization problems.
From the above table, it can be deduced that a few studies have modified the standard CSA algorithm by introducing various techniques to solve some engineering problems, but there is still scope for further improvement. Therefore, to address the original algorithm's flaws, we propose an improved Chameleon Swarm Algorithm using Morlet wavelet mutation and Lévy flight, named mCSAMWL.
The primary contributions of this paper are as follows:
- A modified CSA (Chameleon Swarm Algorithm) is proposed, developed, and applied that combines Morlet wavelet mutation and Lévy flight to balance the search capabilities while avoiding premature convergence to local optima and slow convergence.
- The performance of the proposed mCSAMWL algorithm is established by evaluating it on 68 unimodal and multimodal benchmark test functions, the CEC 2017 test suite functions, and three real-world engineering design problems.
- A clustering technique for WSNs is implemented using the proposed algorithm, demonstrating its efficiency.
The remainder of this paper is organized as follows: Sect. 2 reviews related work; Sect. 3 describes the materials and methods; Sect. 4 presents an empirical evaluation using 97 benchmark functions; Sect. 5 discusses a real-world engineering design problem; Sect. 6 presents results of the proposed mCSAMWL algorithm for balanced clustering in WSNs; and Sect. 7 concludes the paper.
Materials and methods
In 2021, Braik25 introduced a meta-heuristic algorithm, the Chameleon Swarm Algorithm (CSA). This algorithm is based on the way a chameleon hunts and searches for food. Chameleons are a distinct species of animal due to their ability to adapt to their environment. Chameleons eat insects and can survive in alpine, lowland, desert, and semi-arid environments. Chameleons search for food through a series of processes, including locating their target, tracking it with their eyes, and finally attacking it. This section explains how to model this algorithm mathematically.
Initialization and function assessment
CSA, as a population-based algorithm, begins the optimization process with an initial population. In a \(d\)-dimensional search space, a population of \(n\) chameleons, where each chameleon represents a potential solution to the problem, can be represented by a matrix \(z\) of size \(n \times d\). The position of chameleon \(h\) in the search domain at each iteration \(itr\) is described by a vector, as demonstrated below.
where \(h = 1, 2, 3, \dots, n\), \(itr\) denotes the iteration count, \(d\) denotes the problem dimension, and \(z_{itr,d}^{h}\) denotes the \(h^{th}\) chameleon's location. Using a uniform random initialization method, the initial population is generated over the search space while considering the problem's dimension and the number of chameleons, as shown in Eq. (2).
where \(z^{h}\) denotes the \(h^{th}\) chameleon's initial vector, \(rnd\) is a random value between 0 and 1, and \(ub\) and \(lb\) denote the search area's upper and lower boundaries in the \(m^{th}\) dimension. At each stage, the quality of the solution is evaluated with the help of the objective function.
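For illustration, a minimal NumPy sketch of this uniform initialization is given below; treating the bounds as scalars or per-dimension arrays is an assumption of the sketch.

```python
import numpy as np

def initialize_population(n, d, lb, ub, rng=None):
    """Uniform random initialization of n chameleons in a d-dimensional space,
    following the rule z_h = lb + rnd * (ub - lb) described above (Eq. 2)."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.asarray(lb, dtype=float)   # lower bound (scalar or per dimension)
    ub = np.asarray(ub, dtype=float)   # upper bound (scalar or per dimension)
    rnd = rng.random((n, d))           # uniform random numbers in [0, 1)
    return lb + rnd * (ub - lb)        # population matrix of size n x d

# Example: 30 chameleons in a 10-dimensional space bounded by [-100, 100]
pop = initialize_population(30, 10, -100.0, 100.0)
```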
Search of the prey
The position update approach put forth below can be used to represent the chameleon’s movement while foraging mathematically as in Eq. (3).
where \(z_{itr+1}^{h,m}\) is the \(h^{th}\) chameleon's new position in the \(m^{th}\) dimension at the \((itr+1)^{th}\) iteration, \(itr\) and \((itr+1)\) represent the current and next iteration counts respectively, and \(h\) and \(m\) index the chameleon and the dimension. \(z_{itr}^{h,m}\) represents the current position. \(B_{itr}^{h,m}\) and \(G_{itr}^{h,m}\) are the personal best and global best positions, and \(B_{1}\) and \(B_{2}\) govern the exploration ability. \(rnd_{1}\), \(rnd_{2}\) and \(rnd_{3}\) denote uniform random numbers between 0 and 1, \(rnd^{itr}\) is an index-based random number, and \(B_{t}\) is the chameleon's prey-recognition probability.
Chameleon’s eyes rotation
Chameleons' eyes can rotate through 360°, allowing them to see in all directions and monitor their prey's presence. As a result, the position of each chameleon is adjusted to reflect this behaviour, as shown below:
where \(z_{itr+1}^{h}\) is the position after rotation and \(z_{itr}^{h}\) is the current position prior to rotation. \(zrc_{itr}^{h}\) denotes the chameleon's rotational coordinates in the search space, as illustrated in Eq. (5).
where \(rm\) represents the rotation matrix of the chameleons and \(zrc_{itr}^{h}\) denotes the centering coordinates at iteration \(itr\).
Hunting prey
Chameleons complete their hunts by ambushing their prey when it gets close enough to them. The chameleon that approaches its prey the most successfully is considered the best of the group and optimal. This chameleon attacks the prey using its tongue; because the tongue can extend to twice its length, the chameleon's position is updated accordingly. This permits chameleons to exploit the search space and catch their prey, which is mathematically described as follows:
where \(V_{itr+1}^{h,m}\) is the \(h^{th}\) chameleon's new velocity in the \(m^{th}\) dimension at iteration \(itr+1\), and \(V_{itr}^{h,m}\) is the \(h^{th}\) chameleon's current velocity in that dimension. The \(h^{th}\) chameleon's current position is denoted by \(z_{itr}^{h,m}\), and the effects of \(z_{itr}^{h,m}\) and \(G_{itr}^{m}\) on the tongue are regulated by two positive constants, \(c_{1}\) and \(c_{2}\). Here, \(rnd_{1}\) and \(rnd_{2}\) are two random numbers drawn from the range \([0, 1]\), and the inertia weight, denoted by \(\omega\), decreases with every successive generation, as shown in the formula below.
where \(itr\) denotes the present iteration, \(max\_itr\) denotes the total number of iterations, and the positive variable \(\rho\) controls the exploitation capacity. The CSA algorithm creates the chameleons' initial positions in the search space at random as an integral part of the optimization process. Equation (3) is used to update the chameleons' positions in each iteration cycle. If a chameleon leaves the search region, the procedures specified for CSA are used to bring it back to the boundary. Next, a fitness function is used to determine the fittest chameleon after each iteration; the best position of a chameleon in its pursuit of prey is the fittest solution. Following the initialization step, Algorithm 1 iterates through the remaining steps until the maximum iteration criterion is reached. According to the swarm behavior model underlying CSA, chameleons constantly search for and exploit both fixed and moving prey in their environment before moving in to capture it. The optimization potential of the CSA is expressed through these mathematical models.
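To make the hunting step concrete, the following simplified Python sketch implements a velocity-based update of the form described above. The linearly decaying inertia weight and the simple position integration are assumptions of this sketch; the \(\rho\)-controlled decay exponent and the tongue-acceleration term of the original CSA paper are omitted.

```python
import numpy as np

def hunting_update(z, v, best, gbest, itr, max_itr, c1=2.0, c2=1.8, rng=None):
    """Simplified sketch of the velocity-based prey-capturing step described above.
    z, v  : current positions and velocities of the n chameleons, shape (n, d)
    best  : per-chameleon best positions, shape (n, d)
    gbest : global best position, shape (d,)
    c1, c2: positive constants regulating the pull toward gbest and best."""
    rng = np.random.default_rng() if rng is None else rng
    omega = 1.0 - itr / max_itr                  # assumed linearly decaying inertia weight
    rnd1, rnd2 = rng.random(z.shape), rng.random(z.shape)
    v_new = omega * v + c1 * rnd1 * (gbest - z) + c2 * rnd2 * (best - z)
    return z + v_new, v_new                      # simple position integration (assumption)
```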
Chameleon swarm optimization
The swarm-based metaheuristic Chameleon Swarm Algorithm (CSA)25 was proposed by Braik in 2021. CSA mimics chameleon hunting and food finding. Chameleon food hunting involves several processes, including tracking the prey, chasing the prey by sight, and quickly attacking the prey with the long, sticky tongue. Ease of operation and a limited number of adjustable parameters are two of its strengths; nonetheless, it is not very effective at resolving high-dimensional or multimodal problems. The conventional CSA algorithm also suffers from insufficient population diversity, a sluggish convergence rate, and a low degree of precision. For this reason, a new modified Chameleon Swarm Algorithm incorporating Morlet wavelet mutation and a Lévy flight factor (mCSAMWL) is proposed and employed to tackle global optimization problems. The effectiveness of the algorithm has been measured against 68 benchmark test functions and three real-world engineering design problems.
Concept of Morlet wavelet mutation
A physicist, Morlet, coined the term “Morlet wavelet” when he examined a seismic signal transformed by a cosine function26. Wavelet mutation is used to improve algorithm stability, and wavelet mutation operations exhibit a fine-tuning ability. The CSA is vulnerable to becoming ensnared in local optima, preventing the algorithm from exploring the complete search space. In this work, Morlet wavelet mutation is used to enhance the exploration stage, the accuracy of the search, and the stability of the solutions. A straightforward mutation approach does not easily resolve the stagnation phenomenon; the key is to enhance the conventional mutation approach so that it can overcome local optima. Wavelet mutation uses the wavelet function's translation and dilation capabilities to look for alternative solutions in the feasible space that are close to those already known to be good for a set of individuals. To fine-tune the mutation range as the iterations progress, the wavelet function's stretching parameter can be modified to decrease the function's amplitude. As a result, wavelet mutation is used in place of the traditional mutation operator. A mutating wavelet coefficient, \(mp \in [-1, 1]\), is determined for each particle in the swarm at each iteration. If \(mp\) is positive (\(mp>0\)) and close to 1, the mutated particle element tends toward the maximum value of \(z_{itr+1}^{h,m}\); if \(mp\) is negative (\(mp<0\)) and close to -1, the mutated particle element tends toward the minimum value of \(z_{itr+1}^{h,m}\). When \(|mp|\) is large, the fine-tuning search space is large, and when \(|mp|\) is small, the fine-tuning search space is small. The formula for the mutation is:
where \(z_{itr+1}^{h,m}\) \((h = 1, 2, \ldots, N)\) denotes the \(h^{th}\) individual's location at iteration \(itr\), \(lb\) denotes the lower bound and \(ub\) the upper bound of the present search space. The mutating wavelet coefficient \(mp\) is given in Eq. (9).
where, \(\:aa\) is the stretching parameter, which increases with the change of iterations. Its expression is given in Eq. (10).
where, \(\:ss\) indicates a given constant value. \(\:itr\) denotes the present iteration and \(\:{max}_{itr}\) denotes the total number of iterations.
Morlet Wavelet function \(\:mw\) is expressed as in Eq. (11).
where \(numb\) denotes a random number between \(-2.5aa\) and \(2.5aa\).
This strategy ensures that an individual with superior fitness will enter the next iteration, thereby enhancing the algorithm’s convergence speed and optimization capability.
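A minimal Python sketch of this mutation follows. The growth schedule of the stretching parameter `aa` and the constants `ss` and `zeta` are assumptions of the sketch, while the pull toward the upper or lower bound follows the description above.

```python
import numpy as np

def morlet_wavelet_mutation(z, lb, ub, itr, max_itr, ss=10000.0, zeta=2.0, rng=None):
    """Sketch of the Morlet-wavelet-based mutation described above (Eqs. 8-11).
    The stretching parameter aa grows with the iteration count so the wavelet amplitude,
    and hence the mutation range, shrinks in later iterations (assumed schedule)."""
    rng = np.random.default_rng() if rng is None else rng
    aa = np.exp(np.log(ss) * (itr / max_itr) ** zeta)        # stretching parameter (assumed form)
    numb = rng.uniform(-2.5 * aa, 2.5 * aa, size=z.shape)    # random number in [-2.5aa, 2.5aa]
    x = numb / aa
    mp = (1.0 / np.sqrt(aa)) * np.exp(-x ** 2 / 2.0) * np.cos(5.0 * x)  # Morlet wavelet value
    # Positive mp pulls an element toward ub, negative mp pulls it toward lb, as described above.
    z_mut = np.where(mp > 0, z + mp * (ub - z), z + mp * (z - lb))
    return np.clip(z_mut, lb, ub)
```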
Lévy flight distribution
The Lévy flight is a stochastic search strategy introduced by Paul Pierre Lévy in the year 193026,27 that uses a random walk to revise its results, where the steps follow a specific heavy-tailed probability distribution. The step sizes of Lévy flights are highly irregular; by altering the step size, they can be used for both exploration and exploitation. The proposed strategy generates the step sizes using the Lévy distribution to exploit the search area. While exploring new solutions, the Lévy flight random walks must be controlled to avoid large moves that cause solutions to jump outside the search space. For this reason, a step-size factor determined by the size of the relevant problem should be used; a step-size controller with a default value of 0.005 has been put in place to minimize the impact of the Lévy flight on the starting positions and to enable searching around the generated positions. To generate random numbers with a Lévy flight distribution, Eq. (12) is used:
where \(\mu\) and \(v\) follow Gaussian distributions and \(\gamma\) is the step reducer factor with a fixed value of 0.005.
where \(\beta = 1.5\), \(\sigma_{v} = 1\), and the classical gamma function is denoted by the symbol \(\Gamma\).
In Eq. (15), the Lévy flight procedure is presented which is used to update the chameleon positions:
where, \(\:L\) represents the Lévy flight distribution.
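The following Python sketch generates such Lévy steps with Mantegna's algorithm and applies the step reducer factor of 0.005. The exact form of the position update in Eq. (15) is not reproduced in the text, so the update relative to the global best shown here is an assumption.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def levy_step(size, beta=1.5, step_reducer=0.005, rng=None):
    """Lévy-distributed steps via Mantegna's algorithm, scaled by the
    step reducer factor of 0.005 mentioned above (Eqs. 12-14)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (gamma_fn(1 + beta) * np.sin(np.pi * beta / 2) /
               (gamma_fn((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)      # numerator Gaussian with scale sigma_u
    v = rng.normal(0.0, 1.0, size)          # denominator Gaussian with sigma_v = 1
    return step_reducer * u / np.abs(v) ** (1 / beta)

def levy_position_update(z, gbest, rng=None):
    """Assumed form of the Lévy-based position update (Eq. 15): each chameleon
    takes a Lévy step scaled by its distance from the global best solution."""
    step = levy_step(z.shape, rng=rng)
    return z + step * (z - gbest)
```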
Computational complexity analysis of the proposed algorithm
The operational efficiency of the proposed mCSAMWL algorithm, with respect to time and space complexity, is discussed in this section.
Time complexity
The time complexity of the algorithm is influenced by the population size \((n)\), the variable dimension \((d)\), and the number of iterations \((itr)\). For the original Chameleon Swarm Algorithm (CSA), the primary factors contributing to the time complexity are the initialization and the update processes of the chameleon positions (including prey searching, tracking, and capturing). This can be expressed as \(O(n \times d) + O(itr \times n \times d)\), i.e., \(O(itr \times n \times d)\).
The proposed mCSAMWL extends the CSA by incorporating parameter adjustment, Morlet wavelet mutation, and Lévy flight with a step-reducer strategy in each iteration. However, only the Lévy flight distribution strategy affects the algorithm's time complexity, and it adds at most \(O(n \times d)\) per iteration. Therefore, the time complexity of the proposed mCSAMWL algorithm remains of the same order, \(O(itr \times n \times d)\).
Thus, the proposed mCSAMWL achieves improved performance compared with CSA without increasing the time complexity.
Empirical evaluation
MATLAB R2018b is used to examine the efficiency and capabilities of the proposed modified Chameleon Swarm Algorithm using Morlet wavelet and Lévy flight (mCSAMWL). An Intel(R) Core i7-4790 CPU @ 3.60 GHz with 8 GB RAM was used for all experiments. This research evaluates the effectiveness of the proposed algorithm against 68 standard benchmark functions. There are four distinct types of benchmark functions: unimodal with fixed dimensions, multimodal with fixed dimensions, unimodal with variable dimensions, and multimodal with variable dimensions. An algorithm's exploitative and exploratory abilities are commonly evaluated using unimodal and multimodal functions, respectively. The functions used in this research are outlined in Annex A, and their reference is drawn from28. In addition, the proposed mCSAMWL algorithm's performance has been compared with ten commonly used algorithms: the Chameleon Swarm Algorithm (CSA)25, the Elephant Herding Optimization (EHO)29, the Gravitational Search Algorithm (GSA)30, the Ant Colony Optimization (ACO)31, the Earthworm Optimization Algorithm (EWA)32, the Particle Swarm Optimization (PSO)33, the Sine Cosine Algorithm (SCA)34, the Krill Herd Algorithm (KHA)35, the Artificial Bee Colony (ABC)36, and the Monarch Butterfly Optimization (MBO)37. Table 2 displays the parameters of the compared algorithms as originally specified in the aforementioned publications. The 'NFEs' column in Table 2 indicates how many times a given function was evaluated. For each benchmark function, 30 independent runs of each algorithm are executed to generate the results.
Unimodal functions performance evaluation and statistical analysis
The exploitative potential of an algorithm can be measured with the help of unimodal functions. As a result, two tests were conducted using unimodal benchmark functions as part of this research. Table 3 depicts and compares the results of the first experiment, on unimodal fixed-dimension functions. In the second experiment, the outcomes of the 10 algorithms on unimodal variable-dimension functions are compared; Table 4 summarizes the findings.
Tables 3 and 4 show that the mCSAMWL algorithm yielded the global optimum results for the test functions F1-F3, F5, and F6 as well as F8, F9, and F11. It also delivered excellent results for the benchmark functions F11, F15, F18, and F21. This displays the effective exploitation capabilities of the proposed mCSAMWL method. To demonstrate the statistical distinction between the proposed mCSAMWL and other commonly used algorithms, the Friedman mean rank test is used. The statistical findings from the test are presented graphically in Fig. 1, where the proposed algorithm and additional state-of-the-art ones are represented on the X-axis and the Friedman mean ranks are displayed on the Y-axis.
In the figure, a smaller mean rank indicates better performance. As can be seen in Fig. 1, the proposed mCSAMWL algorithm outperforms other popular metaheuristic algorithms when solving unimodal functions: it ranks first, followed by GSA and CSA in second and third place, respectively. The results demonstrate that the mCSAMWL algorithm outperforms conventional metaheuristic algorithms in exploitative behaviour.
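For reference, the mean-rank computation underlying these comparisons can be sketched as follows, assuming a matrix of mean errors with one row per benchmark function and one column per algorithm:

```python
import numpy as np
from scipy.stats import rankdata

def friedman_mean_ranks(results):
    """Sketch of the Friedman mean-rank computation used for the comparisons.
    'results' is an (n_functions, n_algorithms) array of mean errors per function;
    each row is ranked (1 = best, ties averaged) and the ranks are averaged per
    algorithm, so a lower mean rank indicates a better algorithm."""
    ranks = np.apply_along_axis(rankdata, 1, np.asarray(results, dtype=float))
    return ranks.mean(axis=0)
```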
Multimodal function performance evaluation and statistical result analysis
Exploratory behaviour in algorithms is measured with the help of multimodal functions. Within the scope of this research, two experiments were conducted on multimodal benchmark functions. Table 5 shows the first experiment's results, which compare the performance of 10 commonly used metaheuristic algorithms on 27 multimodal fixed-dimension functions. Table 6 shows the performance of the same 10 algorithms on 17 multimodal variable-dimension functions.
After evaluating the statistics, we concluded that the proposed mCSAMWL algorithm can find the global optimal solutions for the benchmark functions F26-F35, F38-F39, F42, F43, and F48-F50, and a comparable result for function F40, as shown in Tables 5 and 6. The proposed mCSAMWL algorithm also outperforms the others on F46, F54, F61, and F63 and gives competitive results for the F50, F58, and F67 functions when compared with the alternative algorithms. This demonstrates how the proposed mCSAMWL algorithm efficiently explores its given search space. The Friedman mean rank test demonstrates the statistical distinction between mCSAMWL and other commonly used algorithms; the graphical representation of the Friedman mean rank outcomes can be seen in Fig. 2.
As shown in Fig. 2, the performance of the proposed mCSAMWL algorithm is superior to existing, widely used metaheuristic algorithms on multimodal benchmark functions. The mCSAMWL achieved the top ranking, followed by the CSA and the GSA in that order, which shows that the proposed mCSAMWL algorithm has statistically superior exploratory behaviour compared with the other commonly used metaheuristic algorithms. Friedman's mean rank test is also performed for the complete statistical evaluation over both the unimodal and multimodal benchmark functions; Fig. 3 depicts the outcome. These results reveal that the proposed mCSAMWL algorithm is the best among the compared algorithms, followed by the CSA and the GSA algorithms. Finally, the proposed mCSAMWL algorithm has demonstrated significant potential for handling a wide range of optimization challenges across various situations.
Comparison of proposed mCSAMWL algorithm with other algorithms on CEC2017 benchmark functions
Table 7 presents a comparative analysis of ten optimization algorithms evaluated on the CEC 2017 benchmark functions (F69-F97). Performance is assessed using the mean and standard deviation. The proposed algorithm outperformed the Chameleon Swarm Algorithm (CSA) specifically on benchmark functions F82, F84, F88, F95, and F97, while maintaining comparable performance levels across all other test functions. Against the other optimization algorithms, the proposed algorithm, mCSAMWL, achieved the lowest mean value for function F70 and also demonstrated superior performance by obtaining the lowest mean values among functions F75, F79-F82, and F84-F86. Furthermore, mCSAMWL exhibited relatively low standard deviations for functions F69-F75, indicating good stability. Notably, mCSAMWL significantly outperformed the other algorithms on functions F79-F85, demonstrating exceptional stability in this range. While both mCSAMWL and ABC frequently achieved the lowest mean values for functions F88-F92, mCSAMWL maintained competitive performance across the remaining functions F86-F97, consistently yielding results close to the optimal values, even when not achieving the absolute lowest mean.
In contrast, EWA and EHO generally exhibited higher variability and less optimal performance compared to the other evaluated algorithms. Overall, mCSAMWL demonstrated strong performance, with particular excellence observed in functions F79-F85. This suggests balanced optimization capabilities, effectively combining exploration and exploitation, as evidenced by its consistent performance across diverse function types. The low standard deviations associated with mCSAMWL further underscore its reliable and stable performance, positioning it as a robust choice for a variety of optimization problems.
Ablation study of the proposed mCSAMWL algorithm
To overcome the defects of the original algorithm, this paper proposes a modified version of the Chameleon Swarm Algorithm. First, the exploration phase of CSA is modified using Morlet wavelet mutation to achieve better convergence performance. Then, the Lévy flight distribution with a step-reducer feature is introduced in the exploitation part to help search agents escape from local optima. To evaluate the effectiveness of each component, two mCSAMWL-derived variants are designed for the comparison study in this subsection; they are listed below, together with the full proposed algorithm:
- CSAMW (modification of CSA with Morlet wavelet mutation only).
- CSALF (modification of CSA with Lévy flight distribution only).
- Proposed mCSAMWL (modified CSA using Morlet wavelet mutation and Lévy flight distribution).
Under the same experimental settings, the original CSA, CSAMW, CSALF, and mCSAMWL are tested on 23 different types of benchmark functions. The obtained median, mean fitness (Mean), and standard deviation (Std) results are listed in Table 8.
A preliminary analysis on the simple functions F1-F3 reveals comparable performance across all methods, with the proposed method and CSA achieving near-optimal results. CSAMW and CSALF exhibit marginally weaker performance on F1. While CSA and CSALF demonstrate advantages on specific functions (F4 and F7 for CSA; F4 for CSALF), the proposed method maintains competitive and comparable or equivalent performance across the majority of other functions.
The proposed algorithm demonstrates a distinct advantage on functions F10-F15, significantly outperforming CSA and CSAMW across most of this range (F10-F14) and achieving superior results on F15. Furthermore, it exhibits improved performance compared to CSALF on F11-F14. Regarding functions F16-F24, the proposed algorithm continues to perform strongly, exhibiting substantial improvements over CSA and generally achieving superior results compared to CSAMW. The comparison with CSALF is more complex, with CSALF demonstrating better performance on F22 and F24; however, the proposed algorithm demonstrates greater overall consistency across this function set. Overall, the proposed algorithm performs well, particularly on more complex functions, demonstrating significant improvements over CSA and frequently outperforming CSAMW.
The Proposed algorithm demonstrates superior overall performance, especially in complex functions, while maintaining good stability. CSAMW shows excellent performance in specific cases but lacks consistency. CSALF excels in simpler functions and maintains good precision but may struggle with more complex optimization problems. Therefore, it can be concluded that the modifications implemented have a demonstrable and positive impact on the algorithm’s capabilities.
Real-world engineering design problems
In this section, the proposed mCSAMWL algorithm is tested on three real-world engineering design problems: the welded beam design, the tension/compression spring design, and the pressure vessel design, and its performance is evaluated. Metaheuristic algorithms are frequently used to solve such real-world engineering design problems, which here involve up to 15,000 function evaluations. The parameter values are identical to those in Table 2. Thirty independent runs were conducted to determine the best, average, standard deviation, and worst outcomes. The proposed mCSAMWL algorithm was evaluated on the MATLAB platform, while the other algorithms' results were obtained from their original research publications.
Welded beam design
The welded beam design serves as a crucial benchmark for evaluating optimization techniques. The problem aims to minimize the total cost of set-up, welding labour, and material associated with constructing the welded beam. Shear stress, bending stress, buckling load, end deflection, and side constraints are among the design constraints. The design variables include the length of the welded part (\(l\)), the thickness of the weld (\(h\)), the width (\(b\)), and the height (\(t\)). The problem's mathematical representation can be expressed in Eqs. (16-25).
Consider
Minimize
Subject to
where,
Range of design variables
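As an illustration of the objective being minimized, the form of the welded beam cost function commonly used in the literature is sketched below; the variable ordering and the omission of the constraint set are simplifications of this sketch.

```python
def welded_beam_cost(x):
    """Sketch of the commonly used welded-beam cost function; the shear-stress,
    bending-stress, buckling-load, deflection, and side constraints are omitted.
    x = [h, l, t, b]: weld thickness, weld length, beam height, beam width."""
    h, l, t, b = x
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)
```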
The mCSAMWL algorithm optimizes the welded beam problem's parameters, and the resulting objective value is 1.69702. It is evident from Tables 9 and 10 that the mCSAMWL algorithm has generated the best solution, superior to the other algorithms except the ACO algorithm. In conclusion, the proposed algorithm reliably obtains good results for the welded beam design problem.
Tension/compression spring design
Tension/compression spring design is another popular mechanical problem. The objective of this problem is to reduce the weight of the spring as much as possible. This can be achieved by managing the coil mean diameter \(D\), the wire diameter \(d\), and the active coil count \(N\). When designing a compression spring, the predetermined restrictions on shear stress, minimum deflection, and surge frequency must be adhered to, all while keeping the weight as low as feasible. The goal variables are defined as follows:
Consider
Minimize
Subject to
Design variables range
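For reference, a sketch of the standard spring-weight objective is given below; the constraint functions on shear stress, deflection, and surge frequency are omitted here.

```python
def spring_weight(x):
    """Sketch of the standard tension/compression spring weight, (N + 2) * D * d^2,
    where d is the wire diameter, D the mean coil diameter and N the number of
    active coils; the design constraints are omitted in this sketch."""
    d, D, N = x
    return (N + 2.0) * D * d ** 2
```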
The tension/compression spring design results from mCSAMWL and the other algorithms are shown in Table 11. The results illustrate that the mCSAMWL algorithm works better than the other state-of-the-art algorithms. Table 12 shows that the proposed mCSAMWL algorithm obtained comparable results for the tension/compression spring design problem after a very small number of function evaluations.
Pressure vessel design
The objective of the pressure vessel design problem is to minimize the fabrication cost by optimizing four variables while satisfying four constraints. The design variables consist of the thickness of the head (\(T_{h}\)), the length of the section without a head (\(L\)), the thickness of the shell (\(T_{s}\)), and the inner radius (\(R\)). The problem can be modeled mathematically as in Eqs. (33-39).
Consider
Minimize
Subject to
Design variables range
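A sketch of the commonly used pressure vessel cost function follows; the four design constraints are omitted, and the variable ordering is an assumption of this sketch.

```python
def pressure_vessel_cost(x):
    """Sketch of the commonly used pressure-vessel fabrication cost; the four
    design constraints are omitted here.
    x = [Ts, Th, R, L]: shell thickness, head thickness, inner radius, section length."""
    Ts, Th, R, L = x
    return (0.6224 * Ts * R * L + 1.7781 * Th * R ** 2
            + 3.1661 * Ts ** 2 * L + 19.84 * Ts ** 2 * R)
```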
Table 13 presents a summary of the best results that the mCSAMWL algorithm and the other commonly used metaheuristic algorithms delivered when solving the problem of designing pressure vessels. Table 13 illustrates that the proposed mCSAMWL algorithm delivered superior outcomes compared to the rest of the algorithms. Table 14 illustrates the results of a statistical analysis carried out on the algorithms used to address the pressure vessel design problem. In summary, the mCSAMWL algorithm delivered the most accurate solutions to the pressure vessel design problem while exhibiting the smallest degree of standard deviation.
Application of the mCSAMWL algorithm for balanced clustering in WSN
Let \(T=\{t_{1},t_{2},\ldots,t_{z}\}\) be the set of \(z\) sensors in the region of interest, where \(t_{i}=(x_{i},y_{i})\in R^{2}\) indicates the location of sensor \(t_{i}\). In order to save energy, sensor nodes are organised into clusters38, with one node in each cluster acting as the cluster head. It is the job of the cluster head to gather data from the other nodes in the cluster and forward it to the base station \(bst\). \(CL=\{cl_{1},cl_{2},\ldots,cl_{y}\}\subset T\) denotes the subset of sensors chosen as cluster heads38. Sensor \(t_{i}\) is a member of cluster \(cl(t_{i})\), which is given by Eq. (40).
where the term denotes the Euclidean distance between sensor \(t_{i}\) and cluster head \(cl_{j}\). For a cluster with sensor subset \(Q_{j}\), the cluster head \(h_{j}\) is defined as in Eq. (41).
This research optimizes the three most commonly used objective functions. The first objective function is the average intra-cluster distance: if the sensor nodes and the cluster head are closer together, less energy is needed to transmit data between them. It is represented in Eq. (42).
where \(D_{s_{i}}^{ch_{k}}\) denotes the Euclidean distance between the sensor nodes and the cluster heads.
Balanced cluster formation is the second objective, as it depends on the node degree, i.e., the number of nodes associated with each CH. It can be achieved by considering the average distance between the CHs: the distance between cluster heads should be maximal so that the clusters are spread throughout the network. It is represented below in Eq. (43).
where, \(\:{D}_{{ch}_{j}}^{{ch}_{i}}\) denotes the distance between cluster heads.
The average cluster head-to-base station distance is the third objective function. The shorter the distance between CHs and BS, the more likely it is that a node closer to BS will be chosen as a CH because it will take less energy to send all the data to BS as given in Eq. (44).
where the distance from cluster head \(CH_{i}\) to the base station is denoted by \(dis(CH_{i},BS)\) and \(K\) represents the number of cluster heads. \(P_{CHBS}\) is then expressed as in Eq. (45).
where \(\phi_{1}\), \(\phi_{2}\) and \(\phi_{3}\) are the weighting coefficients such that \(\phi_{1}+\phi_{2}+\phi_{3}=1\).
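Putting the three objectives together, a minimal Python sketch of this weighted fitness is given below. Treating the inter-cluster-head distance inversely (since it is to be maximized while the fitness is minimized) and the default weights are assumptions of this sketch.

```python
import numpy as np

def clustering_fitness(nodes, ch_idx, bs, phi=(0.4, 0.3, 0.3)):
    """Sketch of the weighted cluster-head selection fitness described above.
    nodes : (z, 2) array of sensor coordinates; ch_idx : indices of the chosen CHs;
    bs    : (2,) base-station coordinates; phi : weights with phi1 + phi2 + phi3 = 1."""
    chs = nodes[ch_idx]
    # Each sensor joins its nearest cluster head (Eq. 40).
    d_to_chs = np.linalg.norm(nodes[:, None, :] - chs[None, :, :], axis=2)
    intra = d_to_chs.min(axis=1).mean()                    # average intra-cluster distance
    # Average pairwise distance between cluster heads (to be maximized).
    pair = np.linalg.norm(chs[:, None, :] - chs[None, :, :], axis=2)
    k = len(ch_idx)
    inter = pair.sum() / (k * (k - 1)) if k > 1 else 0.0
    ch_to_bs = np.linalg.norm(chs - bs, axis=1).mean()     # average CH-to-BS distance
    phi1, phi2, phi3 = phi
    # The inter-CH distance enters the minimized fitness inversely (assumption).
    return phi1 * intra + phi2 * (1.0 / (inter + 1e-12)) + phi3 * ch_to_bs
```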
Results and discussion
In this research, we present an efficient method for selecting CHs using the modified Chameleon Swarm Optimization Algorithm (mCSAMWL) and a fitness function that considers the average intra-cluster distance, the average inter-cluster distance, and the average cluster-head-to-base-station distance. Based on the area covered by the WSN network, the simulations comprise three distinct groups: WSN ~ 1 for 100 × 100 m, WSN ~ 2 for 200 × 200 m, and WSN ~ 3 for 300 × 300 m. Each of these groups is run with different numbers of sensor nodes: G#1 consists of 100 nodes, G#2 of 200 nodes, and G#3 of 300 nodes. The parameter settings for the WSN network are shown in Table 15. In this section, the performance of five commonly used techniques, the Atom Search Optimization (ASO)39, the hybrid Particle Swarm Optimization and Grey Wolf Optimization (PSO-GWO)40, the African Vulture Optimization Algorithm (AVOA)41, the Bald Eagle Search Algorithm (BES)42,43, and the Chameleon Swarm Algorithm (CSA)25, is assessed against the proposed technique in terms of simulation parameters, average energy consumption, residual energy of the network, total energy consumption, dead nodes, and cluster head frequency. The simulation was run for 20,000 rounds to determine which nodes were alive and which were dead, while 1,000 rounds were executed to assess the performance metrics of the algorithms mentioned above.
Simulation parameters
The effectiveness of the proposed clustering technique has been determined using the simulated parameters listed below:
(a) Average Energy Consumption: the mean difference between every sensor node's starting and ending energy levels; in other words, the energy each node in the WSN network uses per round to send and receive data packets.
(b) Total Energy Consumption: the total amount of power consumed by all of the network's nodes during a single round.
(c) Total Residual Energy: the sum of the energy currently remaining in each sensor node.
(d) Dead Node: the number of nodes that died over time during the simulation.
(e) CH Frequency: the frequency with which a sensor node performed the duties of CH during a given time frame. A high frequency indicates that a sensor node is regularly selected as a CH, while a low frequency means it is not.
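Based on these definitions, a small computational sketch of the energy-related metrics (assuming arrays of initial and current node energies) is shown below:

```python
import numpy as np

def energy_metrics(e_start, e_now):
    """Sketch of the per-round metrics defined above, given each node's initial
    energy (e_start) and its current energy (e_now) as NumPy arrays."""
    consumed = e_start - e_now
    return {
        "average_energy_consumption": consumed.mean(),  # mean per-node energy spent
        "total_energy_consumption": consumed.sum(),     # energy dissipated by the network
        "total_residual_energy": e_now.sum(),           # energy still available
        "dead_nodes": int(np.sum(e_now <= 0.0)),        # nodes with depleted energy
    }
```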
Average energy consumption evaluation
Table 16 illustrates the average energy consumption of the metaheuristic algorithms for a range of network sizes and node densities. The clustering technique using the mCSAMWL algorithm consumes less energy on average than the other algorithms; Table 16 clearly shows that it has the lowest average energy consumption in all network scenarios. The performance of the different algorithms in terms of average energy consumption is depicted in Figs. 4 and 5. The proposed technique outperforms its rival algorithms in each of the nine scenarios. The PSO-GWO algorithm performs the worst, followed by the ASO algorithm. The standard CSA algorithm has also shown subpar performance compared with the mCSAMWL-based clustering method.
For WSN ~ 1, the average energy consumption of the clustering technique based on the mCSAMWL algorithm is 0.0387, 0.0351, and 0.0336 joules, which is 1.77%, 3.83%, and 4.81% lower than the CSA technique. For WSN ~ 2, the respective values are 0.0738, 0.0594, and 0.0539 joules, which are 50.1%, 59.8%, and 19.7% better than the CSA technique. Moreover, for WSN ~ 3, with values of 0.1340, 0.1015, and 0.0925 joules, the proposed technique consumes less average energy than the CSA technique by 31.9%, 18.7%, and 52.2%, respectively. These results reveal minimal variance in energy consumption across all the mCSAMWL-based clustering scenarios, which is not the case for the ASO39, PSO-GWO40, AVOA41, BES42,43, and CSA25 techniques.
Total energy consumption evaluation
Figures 6, 7, 8, 9 and 10 depict the total energy consumption of the clustering technique using the mCSAMWL algorithm in comparison with the remaining techniques, ASO39, PSO-GWO40, AVOA41, BES42,43, and CSA25, over 1000 iterations. Based on Table 17, PSO-GWO and ASO are the worst performers in terms of total energy consumption, followed by the BES technique. In all the scenarios considered, the mCSAMWL-based clustering technique performs optimally, followed by the CSA and AVOA techniques. Compared with the mCSAMWL-based clustering technique, the total amount of energy consumed by the other techniques is significantly higher. The faster a sensor node depletes its energy, the sooner the WSN network will collapse; in the case of the mCSAMWL-based clustering technique, the WSN network lasts longer due to its low energy consumption compared with the other techniques.
In the WSN ~ 1 scenario, the total energy consumption of the clustering technique based on the mCSAMWL algorithm is 7.7373, 6.9473, and 6.6526 joules, which is 1.62% lower in G#1 and 6.25% lower in G#3 than the CSA technique, but 5.64% higher in the G#2 scenario. For the WSN ~ 2 scenario, the respective values are 14.7756, 12.0804, and 11.6942 joules, which are 50.7%, 15.7%, and 16.2% better than the CSA technique. Furthermore, with values of 26.8223, 20.4252, and 17.6456 joules, mCSAMWL gives more efficient results in WSN ~ 3 than the CSA technique by 44.3%, 31.9%, and 1%. Compared with the other techniques, the clustering technique based on the mCSAMWL algorithm demonstrated the least variation in energy consumption across all scenarios.
Total residual energy
Table 18 depicts the residual network energy after 1000 iterations for the various scenarios. The mCSAMWL-based clustering technique outperforms the others in terms of residual energy, as shown in Table 18. The more energy that remains in the network, the longer it will last, and Table 18 shows that the mCSAMWL-based clustering technique has the most residual energy left in all WSN scenarios.
As a result, the lifespan of the WSN network is prolonged. The PSO-GWO technique is the worst performer, followed by the ASO technique. The AVOA technique outperformed BES, ASO, and PSO-GWO, but it was unable to compete with the CSA and mCSAMWL-based clustering techniques.
Dead nodes
Table 19 presents the numbers of dead nodes for the six algorithms as a function of the number of rounds completed. At the end of 4000 rounds, the mCSAMWL-based clustering method had no dead nodes, while the other techniques did. ASO has the highest number of dead nodes, followed by PSO-GWO, BES, and CSA. It has been noted that node lifespan is extended in the clustering technique employing the mCSAMWL algorithm when compared with the ASO39, PSO-GWO40, AVOA41, BES42,43, and CSA25 techniques.
Similarly, after 18,000 rounds, the CSA technique has 56 dead nodes, followed by AVOA, PSO-GWO, BES, and ASO. With only 30 dead nodes after 18,000 rounds, the mCSAMWL-based clustering technique clearly stands well ahead of the other techniques.
Cluster head frequency
To ensure that each sensor node expends a comparable amount of energy, the responsibility of being cluster head must be divided equally among the sensor nodes. The frequency with which a node becomes the cluster head for a given network size and density over the first 1000 simulation iterations is depicted in Figs. 11, 12, 13, 14, 15, 16, 17, 18 and 19. It is clear that the behavior of many techniques changes as the network size or density changes. The mCSAMWL-based clustering technique has demonstrated remarkable consistency in selecting sensor nodes to serve as cluster heads, accomplished by distributing the responsibility of cluster head throughout the WSN network and maintaining small oscillations around the average cluster head frequency for all network densities. The PSO-GWO, ASO, and BES techniques are the worst performers in terms of the cluster head frequency parameter.
The proposed modified metaheuristic mCSAMWL algorithm applies Morlet wavelet mutation and Lévy Flight distribution as a different approach to solving optimization challenges. These modifications have made the standard CSA algorithm more effective and assisted in achieving a better equilibrium between the exploitation and exploration phases. The proposed mCSAMWL algorithm’s performance has been assessed using 97 benchmark functions and three real-world engineering design problems. Based on the encouraging outcomes, the proposed mCSAMWL method has been implemented for clustering in WSN. The clustering technique using the proposed mCSAMWL algorithm excels over the original CSA and other clustering techniques in terms of average energy consumption, residual energy of the network, total energy consumption, dead nodes, and cluster head frequency. This technique performs extremely well in all network scenarios with variable node densities. The incorporation of Morlet wavelet and Lévy Flight into the existing standard CSA algorithm has improved the capabilities of the original CSA Algorithm.
Conclusions and recommendations
Metaheuristic algorithms have gained popularity as a fast and effective way to solve optimization problems. To overcome the limitations of the existing studies in this area, this work proposes, develops, and applies a modified, better performing Chameleon Swarm Algorithm incorporating Morlet wavelet and Lévy Flight distribution to enhance the efficacy of the standard CSA algorithm. The Morlet wavelet mutation is used to enhance the exploration phase of the CSA algorithm by exploring the entire search space and dividing it into two distinct regions. To improve the exploitation phase, the Lévy Flight distribution strategy with a step reducer factor is added to the normal CSA algorithm. So, the proposed algorithm applies changes to achieve an appropriate equilibrium between the exploration and exploitation phases. The proposed algorithm’s efficacy is tested on 68 unimodal and multimodal benchmark functions and CEC 2017 test suite functions, and results are compared with 10 commonly used metaheuristic algorithms. The proposed mCSAMWL algorithm obtains the lowest Friedman mean rank, demonstrating its superiority over the other state-of-the-art algorithms.
Furthermore, the proposed algorithm has been used to effectively address three real-world engineering design problems. Finally, the proposed mCSAMWL algorithm has been applied to clustering in WSNs to find the optimal cluster head set and balance the clustering process. The fitness function for this clustering technique uses the average intra-cluster distance, the average inter-cluster distance, and the distance between cluster heads and the base station. This clustering technique has been thoroughly tested with three different WSN scenarios under varying node densities, and its simulation performance has been computed against five commonly used metaheuristic techniques. From the experimental results, the clustering technique using the mCSAMWL algorithm outperforms the other techniques in terms of average energy consumption, total energy consumption, residual energy, dead nodes, and cluster head frequency. Significantly, the clustering technique using the mCSAMWL algorithm increases the lifetime of the WSN network by balancing the cluster formation process and the average energy consumption of the sensor nodes. Further, the proposed improved algorithm can be applied to various clustering, medical imaging, image segmentation, engineering design, data forecasting, classification, feature selection, and other real-world problems. As future work, a variant of mCSAMWL is being developed to handle multi-objective problems.
Data availability
The benchmark functions used in this research are publicly available and can be accessed from the CEC-BC-2017 dataset, referenced in28,29. https://www.kaggle.com/code/kooaslansefat/cec-2017-benchmark.
References
Iwendi, C. et al. A metaheuristic optimization approach for energy efficiency in the IoT networks. Software: Pract. Experience. 51(12), 2558–2571 (2021).
Vennila, H. et al. Static and dynamic environmental economic dispatch using tournament selection based ant Lion optimization algorithm. Front. Energy Res. 10, 972069. https://doi.org/10.3389/fenrg.2022.972069 (2022).
Wang, C. A distributed particle-swarm-optimization-based fuzzy clustering protocol for wireless sensor networks. Sensors 23(15), 6699. https://doi.org/10.3390/s23156699 (2023).
Adegboye, O. R. et al. Refinement of dynamic hunting leadership algorithm for enhanced numerical optimization. IEEE Access. https://doi.org/10.1109/access.2024.3427812 (2024).
Adegboye, O. R. et al. DGS-SCSO: enhancing sand Cat swarm optimization with dynamic pinhole imaging and golden sine algorithm for improved numerical optimization performance. Sci. Rep. 14(1), 1491. https://doi.org/10.1038/s41598-023-50910-x (2024).
Adegboye, O. R. & Deniz Ülker, E. Hybrid artificial electric field employing cuckoo search algorithm with refraction learning for engineering optimization problems. Sci. Rep. 13(1), 4098. https://doi.org/10.1038/s41598-023-31081-1 (2023).
Sridharan, A. Chameleon swarm optimisation with machine learning based sentiment analysis on sarcasm detection and classification model. Int. Res. J. Eng. Technol. 8(10), 821–828 (2021).
Umamageswari, A., Bharathiraja, N. & Irene, D. S. A novel fuzzy C-means based chameleon swarm algorithm for segmentation and progressive neural architecture search for plant disease classification. ICT Express. https://doi.org/10.1016/j.icte.2021.08.019 (2021).
Rizk-Allah, R. M., El‐Hameed, M. A. & El‐Fergany, A. A. Model parameters extraction of solid oxide fuel cells based on semi‐empirical and memory‐based chameleon swarm algorithm. Int. J. Energy Res. 45(15), 21435–21450. https://doi.org/10.1002/er.7192 (2021).
Anitha, S., Saravanan, S. & Chandrasekar, A. A modified Gray wolf-based chameleon swarm algorithm for minimizing energy consumption and enabling secure communication in wireless sensor network. Concurrency Computation: Pract. Experience. 34(26), e7295. https://doi.org/10.1002/cpe.7295 (2022).
Rizk-Allah, R. M., Hassanien, A. E. & Snášel, V. A hybrid chameleon swarm algorithm with superiority of feasible solutions for optimal combined heat and power economic dispatch problem. Energy 254, 124340. https://doi.org/10.1016/j.energy.2022.124340 (2022).
Mostafa, R. R., Ewees, A. A., Ghoniem, R. M., Abualigah, L. & Hashim, F. A. Boosting chameleon swarm algorithm with consumption AEO operator for global optimization and feature selection. Knowl. Based Syst. 246, 108743. https://doi.org/10.1016/j.knosys.2022.108743 (2022).
Wang, J., Lv, M., Li, Z. & Zeng, B. Multivariate selection-combination short-term wind speed forecasting system based on convolution-recurrent network and multi-objective chameleon swarm algorithm. Expert Syst. Appl. 214, 119129. https://doi.org/10.1016/j.eswa.2022.119129 (2023).
Hu, G., Yang, R., Qin, X. & Wei, G. MCSA: Multi-strategy boosted chameleon-inspired optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 403, 115676. https://doi.org/10.1016/j.cma.2022.115676 (2023).
Damin, Z., Yi, W. & Linna, Z. Chameleon swarm algorithm for segmental variation learning of population and S-type weight. J. Syst. Simul. 35(1), 11. https://doi.org/10.16182/j.issn1004731x.joss.21-0968 (2023).
Braik, M. S., Awadallah, M. A., Al-Betar, M. A., Hammouri, A. I. & Zitar, R. A. A non-convex economic load dispatch problem using chameleon swarm algorithm with roulette wheel and levy flight methods. Appl. Intell. 1–40. https://doi.org/10.1007/s10489-022-04363-w (2023).
Hu, G., Yang, R. & Wei, G. Hybrid chameleon swarm algorithm with multi-strategy: A case study of degree reduction for disk Wang–Ball curves. Math. Comput. Simul. 206, 709–769. https://doi.org/10.1016/j.matcom.2022.12.001 (2023).
Sun, Y. et al. Research on signal detection of OFDM systems based on the LSTM network optimized by the improved chameleon swarm algorithm. Mathematics 11(9), 1989. https://doi.org/10.3390/math11091989 (2023).
Zhou, J. & Xu, Z. Optimal sizing design and integrated cost-benefit assessment of stand-alone microgrid system with different energy storage employing chameleon swarm algorithm: A rural case in Northeast China. Renew. Energy. 202, 1110–1137 (2023).
Dinh, P-H. Medical image fusion based on enhanced three-layer image decomposition and chameleon swarm algorithm. Biomed. Signal Process. Control. 84, 104740 (2023).
Abed-alguni, B. H. Island-based cuckoo search with highly disruptive polynomial mutation. Int. J. Artif. Intell. 17(1), 57–82. https://doi.org/10.1007/s13369-020-05141-x (2019).
Abed-alguni, B. H. & Paul, D. Island-based cuckoo search with elite opposition-based learning and multiple mutation methods for solving discrete and continuous optimization problems. https://doi.org/10.1007/s00500-021-06665-6 (2021).
Abed-alguni, B. H., Alawad, N. A., Barhoush, M. & Hammad, R. Exploratory cuckoo search for solving single-objective optimization problems. Soft. Comput. 25(15), 10167–10180. https://doi.org/10.1007/s00500-021-05939-3 (2021).
Abed-Alguni, B. H., Paul, D. & Hammad, R. Improved salp swarm algorithm for solving single-objective continuous optimization problems. Appl. Intell. 52(15), 17217–17236. https://doi.org/10.1007/s10489-022-03269-x (2022).
Braik, M. S. Chameleon swarm algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Syst. Appl. 174, 114685. https://doi.org/10.1016/j.eswa.2021.114685 (2021).
Gao, Y., Zhang, H., Duan, Y. & Zhang, H. A novel hybrid PSO based on levy flight and wavelet mutation for global optimization. Plos One. 18(1), e0279572. https://doi.org/10.1371/journal.pone.0279572 (2023).
Jagadeesh, S. & Muthulakshmi, I. Dynamic clustering and routing using multi-objective particle swarm optimization with levy distribution for wireless sensor networks. Int. J. Commun Syst. 34(13). https://doi.org/10.1002/dac.4902 (2021).
Isiet, M. & Gadala, M. Sensitivity analysis of control parameters in particle swarm optimization. J. Comput. Sci. 41, 101086 (2020).
Singh, H., Singh, B. & Kaur, M. An improved elephant herding optimization for global optimization problems. Eng. Comput. 1–33. https://doi.org/10.1007/s00366-021-01471-y (2021).
Mittal, H., Tripathi, A., Pandey, A. C. & Pal, R. Gravitational search algorithm: a comprehensive analysis of recent variants. Multimedia Tools Appl. 80, 7581–7608. https://doi.org/10.1007/s11042-020-09831-4 (2021).
Fidanova, S. & Fidanova, S. Ant colony optimization. Ant Colony Optim. Appl. 3–8. https://doi.org/10.1007/978-3-030-67380-2_2 (2021).
Pasupuleti, V. & Balaswamy, C. Performance analysis of fractional earthworm optimization algorithm for optimal routing in wireless sensor networks. EAI Endorsed Trans. Scalable Inform. Syst. 8, 32. https://doi.org/10.4108/eai.21-4-2021.169419 (2021).
Tian, Y. & Chang, Y. Application of the particle swarm optimization algorithm-back propagation neural network algorithm introducing new parameter terms in the application field of industrial design. Results Eng. https://doi.org/10.1016/j.rineng.2023.101728 (2023).
Gabis, A. B., Meraihi, Y., Mirjalili, S. & Ramdane-Cherif, A. A comprehensive survey of sine cosine algorithm: variants and applications. Artif. Intell. Rev. 54(7), 5469–5540. https://doi.org/10.1007/s10462-021-10026-y (2021).
Sadrishojaei, M., Navimipour, N. J., Reshadi, M. & Hosseinzadeh, M. A new clustering-based routing method in the mobile internet of things using a Krill herd algorithm. Cluster Comput. 1–11 https://doi.org/10.1007/s10586-021-03394-1 (2022).
Kaya, E., Gorkemli, B., Akay, B. & Karaboga, D. A review on the studies employing artificial bee colony algorithm to solve combinatorial optimization problems. Eng. Appl. Artif. Intell. 115, 105311. https://doi.org/10.1016/j.engappai.2022.105311 (2022).
Ghetas, M. Learning-based monarch butterfly optimization algorithm for solving numerical optimization problems. Neural Comput. Appl. 1–19. https://doi.org/10.1007/s00521-021-06654-8 (2022).
Wohwe Sambo, D., Yenke, B. O., Förster, A. & Dayang, P. Optimized clustering algorithms for large wireless sensor networks: A review. Sensors 19(2), 322. https://doi.org/10.3390/s19020322 (2019).
Bi, J. & Zhang, Y. An improved atom search optimization for optimization tasks. Multimedia Tools Appl. 82(5), 6375–6429. https://doi.org/10.1007/s11042-022-13171-w (2023).
Gul, F. et al. Meta-heuristic approach for solving multi-objective path planning for autonomous guided robot using PSO–GWO optimization algorithm with evolutionary programming. J. Ambient Intell. Humaniz. Comput. 12, 7873–7890. https://doi.org/10.1007/s12652-020-02514-w (2021).
Abdollahzadeh, B., Gharehchopogh, F. S. & Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 158, 107408. https://doi.org/10.1016/j.cie.2021.107408 (2021).
Kapileswar, N. & Phani Kumar, P. Energy efficient routing in IOT based UWSN using bald eagle search algorithm. Trans. Emerg. Telecommunications Technol. 33(1), e4399. https://doi.org/10.1002/ett.4399 (2022).
Kusla, V. & Brar, G. S. MOBES: a modified bald eagle search based technique for optimal clustering in wireless sensor network. Univ. Politehnica Buchar. Sci. Bull. Ser. C 86(1), 207–224 (2024).
Acknowledgements
The authors would like to acknowledge Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R197), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. The authors would also like to thank the Automated Systems and Soft Computing Lab (ASSCL) at Prince Sultan University, Riyadh, Saudi Arabia, for supporting this work.
Funding
The authors would like to acknowledge Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R197), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Author information
Contributions
All authors reviewed the manuscript. V. Kusla: Conceptualization, Methodology, Formal analysis, Validation, Writing-original draft. G.S. Brar: Formal analysis, Validation. Harpreet Kaur: Methodology, Formal analysis, Investigation, Data curation, Writing-review and editing. Ramandeep Sandhu: Conceptualization, Resources, Visualization, Supervision. Chander Prabha: Methodology, Validation. Md Rittique Alam: Writing-review and editing. Md. Mehedi Hassan: Supervision, Investigation, Writing-review and editing. Shahab Abdullah: Writing-review and editing. Samah Alshathri: Conceptualization, Resources, Funding, Supervision. Walid El-Shafai: Methodology, Validation, Visualization.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix 1
In the following table, F. no. is the function number, Function name is the name of the function, Dim is the number of dimensions (design variables), Range gives the lower and upper bounds of the search space, and Global value is the global optimum of the function. A short example of evaluating one of these functions in code is given after the table.
F. no. | Function name | Dim | Range | Global value |
---|---|---|---|---|
Unimodal functions with fixed dimension | ||||
F1 | Beale | 2 | [-4.5, 4.5] | 0 |
F2 | Booth | 2 | [-10, 10] | 0 |
F3 | Brent | 2 | [-10, 10] | 0 |
F4 | Matyas | 2 | [-10, 10] | 0 |
F5 | Schaffer N. 4 | 2 | [-100, 100] | 0.292579 |
F6 | Wayburn Seader 3 | 2 | [-500, 500] | 19.10588 |
F7 | Leon | 2 | [-1.2, 1.2] | 0 |
F8 | Cube | 2 | [-10, 10] | 0 |
F9 | Zettl | 2 | [-5, 10] | -0.00379 |
Unimodal functions with variable dimensions | ||||
F10 | Sphere | 30 | [-100, 100] | 0 |
F11 | Power Sum | 30 | [-1, 1] | 0 |
F12 | Schwefel’s 2.20 | 30 | [-100, 100] | 0 |
F13 | Schwefel’s 2.21 | 30 | [-100, 100] | 0 |
F14 | Step | 30 | [-100, 100] | 0 |
F15 | Stepint | 30 | [-5.12, 5.12] | -155 |
F16 | Schwefel’s 2.22 | 30 | [-100, 100] | 0 |
F17 | Schwefel’s 2.23 | 30 | [-10, 10] | 0 |
F18 | Rosenbrock | 30 | [-30, 30] | 0 |
F19 | Brown | 30 | [-1, 4] | 0 |
F20 | Dixon and Price | 30 | [-10, 10] | 0 |
F21 | Powell Singular | 30 | [-4, 5] | 0 |
F22 | Xin-She Yang | 30 | [-20, 20] | -1 |
F23 | Perm 0, D, Beta | 5 | [-Dim, Dim] | 0 |
F24 | Sum Squares | 30 | [-10, 10] | 0 |
Multimodal functions with fixed dimension | ||||
F25 | Egg Crate | 2 | [-5, 5] | 0 |
F26 | Ackley N.3 | 2 | [-32, 32] | -195.629 |
F27 | Adjiman | 2 | [-1, 2] | -2.02181 |
F28 | Bird | 2 | [-2π, 2π] | -106.765 |
F29 | Camel 6 Hump | 2 | [-5, 5] | -1.0316 |
F30 | Branin RCOS | 2 | [-5, 5] | 0.397887 |
F31 | Goldstein Price | 2 | [-2, 2] | 3 |
F32 | Hartman 3 | 3 | [0, 1] | -3.86278 |
F33 | Hartman 6 | 6 | [0, 1] | -3.32236 |
F34 | Cross-in-tray | 2 | [-10, 10] | -2.06261 |
F35 | Bartels Conn | 2 | [-500, 500] | 1 |
F36 | Bukin 6 | 2 | [(-15, -5), (-5, -3)] | 180.3276 |
F37 | Carrom Table | 2 | [-10, 10] | -24.1568 |
F38 | Chichinadze | 2 | [-30, 30] | -43.3159 |
F39 | Cross function | 2 | [-10, 10] | 0 |
F40 | Cross leg table | 2 | [-10, 10] | -1 |
F41 | Crowned Cross | 2 | [-10, 10] | 0.0001 |
F42 | Easom | 2 | [-100, 100] | -1 |
F43 | Giunta | 2 | [-1, 1] | 0.060447 |
F44 | Helical Valley | 3 | [-10, 10] | 0 |
F45 | Himmelblau | 2 | [-5, 5] | 0 |
F46 | Holder | 2 | [-10, 10] | -19.2085 |
F47 | Pen Holder | 2 | [-11, 11] | -0.96354 |
F48 | Test Tube Holder | 2 | [-10, 10] | -10.8723 |
F49 | Shubert | 2 | [-10, 10] | -186.731 |
F50 | Shekel | 4 | [0, 10] | -10.5364 |
F51 | Three-Hump Camel | 2 | [-5, 5] | 0 |
Multimodal function with variable dimension | ||||
F52 | Schwefel’s 2.26 | 30 | [-500, 500] | -418.983 |
F53 | Rastrigin | 30 | [-5.12, 5.12] | 0 |
F54 | Periodic | 30 | [-10, 10] | 0.9 |
F55 | Qing | 30 | [-500, 500] | 0 |
F56 | Alpine N. 1 | 30 | [-10, 10] | 0 |
F57 | Xin-She Yang | 30 | [-5, 5] | 0 |
F58 | Ackley | 30 | [-32, 32] | 0 |
F59 | Trigonometric 2 | 30 | [-500, 500] | 0 |
F60 | Salomon | 30 | [-100, 100] | 0 |
F61 | Styblinski-Tang | 30 | [-5, 5] | -1174.98 |
F62 | Griewank | 30 | [-100, 100] | 0 |
F63 | Xin-She Yang N. 4 | 30 | [-10, 10] | -1 |
F64 | Xin-She Yang N. 2 | 30 | [-2π, 2π] | 0 |
F65 | Gen. Penalized | 30 | [-50, 50] | 0 |
F66 | Penalized | 30 | [-50, 50] | 0 |
F67 | Michalewicz | 30 | [0, π] | -29.6309 |
F68 | Quartic Noise | 30 | [-1.28, 1.28] | 0 |
CEC-BC-2017 Functions | ||||
F69 | Shifted and Rotated Bent Cigar Function | 10 | [-100, 100] | 100 |
F70 | Shifted and Rotated Rosenbrock Function | 10 | [-100, 100] | 300 |
F71 | Shifted and Rotated Rastrigin Function | 10 | [-100, 100] | 400 |
F72 | Shifted and Rotated Expanded Scaffer’s F6 Function | 10 | [-100, 100] | 500 |
F73 | Shifted and Rotated Lunacek Bi Rastrigin Function | 10 | [-100, 100] | 600 |
F74 | Shifted and Rotated Non-Continuous Rastrigin’s Function | 10 | [-100, 100] | 700 |
F75 | Shifted and Rotated Levy Function | 10 | [-100, 100] | 800 |
F76 | Shifted and Rotated Schwefel’s Function | 10 | [-100, 100] | 900 |
F77 | Hybrid Function 1 (N = 3) | 10 | [-100, 100] | 1000 |
F78 | Hybrid Function 2 (N = 3) | 10 | [-100, 100] | 1100 |
F79 | Hybrid Function 3 (N = 3) | 10 | [-100, 100] | 1200 |
F80 | Hybrid Function 4 (N = 4) | 10 | [-100, 100] | 1300 |
F81 | Hybrid Function 5 (N = 4) | 10 | [-100, 100] | 1400 |
F82 | Hybrid Function 6 (N = 4) | 10 | [-100, 100] | 1500 |
F83 | Hybrid Function 6 (N = 5) | 10 | [-100, 100] | 1600 |
F84 | Hybrid Function 6 (N = 5) | 10 | [-100, 100] | 1700 |
F85 | Hybrid Function 6 (N = 5) | 10 | [-100, 100] | 1800 |
F86 | Hybrid Function 6 (N = 6) | 10 | [-100, 100] | 1900 |
F87 | Composite Function 1 (N = 3) | 10 | [-100, 100] | 2000 |
F88 | Composite Function 2 (N = 3) | 10 | [-100, 100] | 2100 |
F89 | Composite Function 3 (N = 4) | 10 | [-100, 100] | 2200 |
F90 | Composite Function 4 (N = 4) | 10 | [-100, 100] | 2300 |
F91 | Composite Function 5 (N = 5) | 10 | [-100, 100] | 2400 |
F92 | Composite Function 6 (N = 5) | 10 | [-100, 100] | 2500 |
F93 | Composite Function 7 (N = 6) | 10 | [-100, 100] | 2600 |
F94 | Composite Function 8 (N = 6) | 10 | [-100, 100] | 2700 |
F95 | Composite Function 9 (N = 6) | 10 | [-100, 100] | 2800 |
F96 | Composite Function 10 (N = 3) | 10 | [-100, 100] | 2900 |
F97 | Composite Function 11 (N = 3) | 10 | [-100, 100] | 3000 |
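As a quick illustration of how the table entries translate into code, the sketch below evaluates F10 (Sphere) at a random point within its stated range and at its known global optimum. It assumes NumPy and is intended only as a reading aid for the table, not as part of the benchmarking pipeline.

```python
import numpy as np

def sphere(x):
    """F10 (Sphere): unimodal, Dim = 30, Range [-100, 100], Global value 0 at x = 0."""
    return float(np.sum(np.asarray(x) ** 2))

x = np.random.uniform(-100, 100, 30)   # a random point inside the stated range
print(sphere(x))                       # some positive value
print(sphere(np.zeros(30)))            # 0.0, the global optimum listed in the table
```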
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Kusla, V., Brar, G.S., Kaur, H. et al. Chameleon swarm algorithm with Morlet wavelet mutation for superior optimization performance. Sci Rep 15, 13971 (2025). https://doi.org/10.1038/s41598-025-97015-1