Abstract
High-dimensional optimization has numerous potential applications in both academia and industry. Generating highly accurate solutions in high-dimensional search spaces is a major challenge for optimization algorithms: traditional search tools are prone to the curse of dimensionality and to local optima, and thus fail to provide high-precision results. To solve these problems, a novel hermit crab optimization algorithm (the HCOA) is introduced in this paper. Inspired by the group behaviour of hermit crabs, the HCOA combines an optimal search and a historical path search to balance depth and breadth searches. In the experimental section of the paper, the HCOA competes with 5 well-known metaheuristic algorithms on the 29 CEC2017 benchmark functions and ranks first on 23 of them. The state-of-the-art BPSO-CM is also chosen for comparison, and this competition shows that the HCOA performs better in the 100-dimensional test of the CEC2017 benchmark functions. All the experimental results demonstrate that the HCOA produces highly accurate and robust results for high-dimensional optimization problems.
Introduction
Nature follows many basic laws, such as cooperation and competition, genetic variation, and survival of the fittest. In recent years, some scholars have been inspired by these natural processes and have applied them to the domain of computing. Genetic algorithms (GAs)1 are inspired by the process of biological evolution through selection, inheritance and mutation. Differential evolution (DE)2 is inspired by cooperation and competition among individuals in a biological population. DE converges quickly and accurately when minimizing possibly nonlinear and nondifferentiable continuous-space functions. The group search optimizer (GSO)3 is inspired by the process of production and consumption. GSO performs better than other algorithms on multimodal benchmark functions with few local minima.
Similarly, some researchers have proposed optimization algorithms inspired by animal behaviour to solve optimization problems. Kennedy4 proposed particle swarm optimization (PSO), which is inspired by the social behaviour of foraging birds, to effectively optimize nonlinear functions in multidimensional spaces. To better optimize multivariable functions, Karaboga5 proposed the artificial bee colony algorithm (ABC), inspired by honey bees' social behaviour. ABC is used only to optimize 10-, 20- and 30-dimensional functions. The firefly algorithm (FA)6 utilizes the influence of light on fireflies, and the FA shows a significantly improved performance over PSO in multimodal optimization problems. Cuckoo search optimization7 mimics the parasitic behaviour of the cuckoo bird. Chicken swarm optimization (CSO)8 simulates the hierarchical structure of chicken flocks and the foraging behaviour of chickens, including roosters and hens. The dragonfly algorithm (DA)9 simulates the survival behaviour of dragonflies, including separation, parade, aggregation, predation and escape. The DA is superior to other algorithms for optimization in 30 dimensions, but it lacks a breadth search in high-dimensional spaces and therefore performs poorly on high-dimensional optimization problems. The lion optimization algorithm (LOA)10 is inspired by a simulation of lions' behaviours of solitude and cooperation. The LOA outperforms other optimization algorithms only in 30 dimensions of the benchmark functions, and it tends to fall into a local optimum prematurely in high-dimensional problems. Inspired by the process by which ants find the shortest path between food and their residence, Dorigo11 proposed the ant colony optimization algorithm (ACO). The whale optimization algorithm (WOA)12 simulates the prey hunting, prey encircling, and bubble-net hunting behaviours of humpback whales.
There are also some nature-inspired and animal-inspired algorithms that are extensively used by researchers in various fields, such as path design13,14,15, control autoregressive models16,17,18,19 and urban development20,21.
According to the no free lunch theorem, no single algorithm can solve every optimization problem. Therefore, it is necessary to develop or improve metaheuristic optimization algorithms to address different types of optimization problems. High-dimensional optimization is a typical representative of such problems. With the continuous development of blockchain technology22,23,24,25, big data26,27 and practical nanotechnology28,29,30, the dimensionality of optimization problems is increasing dramatically. Li31 proposed a dimension-by-dimension dynamic sine cosine algorithm (DDSCA). In the DDSCA, the solution of each dimension is obtained first, and then a greedy algorithm combines it with the solutions of the other dimensions to form a new solution. Yang32 introduced an elite directed particle swarm optimization algorithm (EDPSO), which uses historical information about particles to efficiently solve high-dimensional optimization problems. Chen33 designed an efficient hierarchical surrogate-assisted differential evolution (EHSDE), which balances exploration and exploitation in high-dimensional optimization spaces using a hierarchical approach. On the one hand, the above algorithms cannot effectively balance depth and breadth searches in high-dimensional spaces. On the other hand, they either cannot escape local optima in the initial stages of the search or are unable to find a preferable value after jumping out of a local optimum. It is therefore crucial to develop a new optimization algorithm that solves high-dimensional optimization problems as effectively as possible.
This paper introduces a new optimization algorithm, named the hermit crab optimization algorithm (HCOA), to solve high-dimensional optimization problems. It is inspired by the distinctive behaviour of hermit crabs in searching for, and changing to, appropriate houses to survive during their continuous growth. More specifically, the main research contributions of this paper are as follows:
1. Optimal search: The hermit crabs search in the vicinity of the alpha hermit crab of the entire population. In adherence to this rule, the HCOA guarantees the accuracy of the search.

2. Historical path search: The hermit crabs search around the historical path of the population's alpha hermit crabs. With this strategy, the HCOA balances breadth and depth searches in a high-dimensional space, which helps the HCOA jump out of local optima.
The remaining sections of the manuscript are organized as follows. “Materials and methods” elaborates on the proposed algorithm in detail. “Results” shows the details and results of the simulation experiments. “Conclusions” concludes this work and presents future works.
Materials and methods
Behaviour of hermit crab
Hermit crabs are arthropods similar to shrimp and crabs that live mainly in coastal areas. They are also omnivorous and are known as the "scavengers" of the seashore, eating everything from algae and food scraps to parasites, and they play an essential role in the ecological balance. However, hermit crabs rely heavily on their houses for survival. Years of research have shown that proper houses help hermit crabs survive, feed and resist predators; if hermit crabs lose their houses, the soft tissue structures of their abdomens become exposed and unprotected. Hermit crabs may die if they live in unsuitable houses or have no houses to live in for a long time. As they grow, hermit crabs continuously search for and acquire houses that are appropriate for their survival. This population behaviour of searching for, and changing to, new houses is a unique natural process. A hermit crab searches for a proper house in its surrounding location or occupies an aged house that other crabs have shed. If the hermit crab is unable to find a suitable new house, it must return to its original house.
Hermit crab optimization algorithm
Inspired by the constant house-searching and house-changing behaviour of hermit crabs, we idealize the characteristics of hermit crabs' behaviours. Relating the process of hermit crabs' house searching and house changing to the objective function to be optimized, we design a hermit crab-inspired optimization algorithm (the HCOA). In hermit crab populations, many factors are involved in selecting the right house, including size, species and colour. In the HCOA, for simplicity, we assume that each hermit crab has no mass or volume and represents only a point in space, and each point is a solution to a certain problem. The suitability of each hermit crab's new house is associated with the value of the target function, so the fitness of a solution is analogous to the appropriateness of a house. Because of the large and variable distribution of crustaceans in coastal areas, we randomly generate a large number of houses in the HCOA. Based on the behaviour of the hermit crabs, we use two house-searching and house-changing rules, denoted the optimal search and the historical path search. These two strategies help the HCOA balance depth and breadth searches in a high-dimensional search space and increase the possibility of jumping out of a local optimum. The search diagram for the HCOA is shown in Fig. 1. The basic steps of the HCOA are summarized in the pseudocode displayed in Algorithm 1, and the HCOA flowchart is displayed in Fig. 2. The two HCOA search strategies introduce only linear transformations to the time complexity; therefore, the time complexity of the HCOA remains linear, O(n).
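The overall flow described above can be sketched as a short loop. This is a minimal illustration, not the published pseudocode of Algorithm 1: the step size `sigma`, its decay, and the sampling details are assumptions, and `sphere` is a stand-in objective.

```python
import numpy as np

def sphere(x):
    """Example objective: minimize the sum of squares (optimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def hcoa(fitness, dim, pop_size=100, max_iter=200,
         lower=-100.0, upper=100.0, sigma=0.1, decay=0.99, seed=0):
    """Minimal sketch of an HCOA-style loop: a greedy Gaussian search around
    the current alpha hermit crab (optimal search) combined with sampling
    around the last five alpha houses (historical path search)."""
    rng = np.random.default_rng(seed)
    pbest = rng.uniform(lower, upper, (pop_size, dim))     # initial houses
    pbest_fit = np.array([fitness(x) for x in pbest])
    alpha = pbest[np.argmin(pbest_fit)].copy()             # alpha hermit crab
    alpha_fit = float(np.min(pbest_fit))
    history = [alpha.copy()]                               # recently shed houses
    std = sigma * (upper - lower)
    for _ in range(max_iter):
        for i in range(pop_size):
            # optimal search: candidate house near the alpha crab
            cand = rng.normal(alpha, std)
            f = fitness(cand)
            if f < pbest_fit[i]:
                pbest[i], pbest_fit[i] = cand, f           # change houses
            # historical path search: candidate near a recently shed house
            cand = rng.normal(history[rng.integers(len(history))], std)
            f = fitness(cand)
            if f < pbest_fit[i]:
                pbest[i], pbest_fit[i] = cand, f
        best = int(np.argmin(pbest_fit))
        if pbest_fit[best] < alpha_fit:                    # new alpha crab
            history = (history + [alpha.copy()])[-5:]      # keep last five
            alpha, alpha_fit = pbest[best].copy(), float(pbest_fit[best])
        std *= decay                                       # shrink the search
    return alpha, alpha_fit
```

Both rules accept a candidate only if it improves a crab's personal best, so per iteration the work grows linearly with the population size, matching the O(n) claim.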

Optimal search
The alpha hermit crab of the crab population gains more valuable survival experience than the other hermit crabs, and it is more experienced in finding a new house. Therefore, the other hermit crabs are more likely to find appropriate houses in the vicinity of the population's alpha hermit crab. If a hermit crab finds a more appropriate house than the one it currently has, it changes houses; if it does not, it continues to use the original house in order to survive. In the HCOA, after each calculation of the function fitness, the fitness of all the hermit crabs is ranked, and the hermit crab with the best fitness is compared with the alpha hermit crab. If the hermit crab with the best fitness is better, it is more experienced in survival than the existing alpha hermit crab and replaces it. The optimal search process is summarized in the pseudocode shown in Algorithm 2. With the guidance of this rule, the HCOA can accurately find the optimal solution.
In the \(t\)th generation, the alpha hermit crab holds the most appropriate house position \(Alpha^t\). \(Pbest^{t}(\gamma )\) denotes the most appropriate house position found by hermit crab \(\gamma\) up to the \(t\)th generation, and \(Pcandidate^{t+1}(\gamma )\) is its candidate house position for the \((t + 1)\)th generation. \(GD(\alpha ,\delta )\) is a Gaussian distribution with mean \(\alpha\) and standard deviation \(\delta\), which is used to simulate the distribution of the houses.
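In code, one optimal-search step amounts to drawing each candidate house from \(GD(Alpha^t, \delta)\) and accepting it only on improvement. The following sketch assumes this greedy acceptance; the `sphere` objective and the value of `delta` in the test are illustrative.

```python
import numpy as np

def optimal_search(pbest, pbest_fit, alpha, delta, fitness, rng):
    """One optimal-search step: every hermit crab draws a candidate house
    Pcandidate ~ GD(Alpha, delta), i.e. a Gaussian centred on the alpha
    crab's house, and changes houses only if the candidate is fitter."""
    for i in range(len(pbest)):
        candidate = rng.normal(alpha, delta)   # GD(alpha, delta), per dimension
        f = fitness(candidate)
        if f < pbest_fit[i]:                   # a more appropriate house
            pbest[i], pbest_fit[i] = candidate, f
    return pbest, pbest_fit
```

Because acceptance is greedy, personal bests can never get worse, which is what lets this rule guarantee the accuracy of the search.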

Historical path search
The alpha hermit crab of the entire crab population is replaced when another hermit crab has a more appropriate house than the alpha hermit crab. Each generation of alpha hermit crabs sheds its original house when it acquires a more appropriate one: the original house remains in place while the alpha hermit crab moves on. These original houses may be used by other hermit crabs, and more appropriate houses may exist around them. The houses abandoned by alpha hermit crabs change with the environment and hermit crab behaviours: they may simply disappear, or they may reappear near their original location. Therefore, in the HCOA, other hermit crabs search around the historical path formed by the five houses most recently shed by alpha hermit crabs, because there may be a better chance of finding a suitable house there. The historical path gives the HCOA deeper search spaces: a hermit crab may find a better house near the five historical positions and replace its house, which increases the breadth of the HCOA search in high-dimensional spaces. If a more suitable house is not found, the hermit crab returns to its original house. The historical path search process is summarized in the pseudocode shown in Algorithm 3.
In the \(t\)th generation, the best personal position of each hermit crab is \(Pbest^{t}(\gamma )\). By the definition of the HCOA, \(\omega \in \{1, \ldots, 5\}\) indexes the houses most recently shed by the population's alpha hermit crabs, while the current alpha hermit crab keeps the house it most recently moved into. We use \(Ghistory^{t}(\omega )\) to record the historical positions of the alpha hermit crabs' houses. \(Candidate\_pbest^{t+1}(\gamma , \omega )\) denotes the position that hermit crab \(\gamma\) samples around the \(\omega\)th historical house. \(GD(\beta ,\lambda )\) is a Gaussian distribution with mean \(\beta\) and standard deviation \(\lambda\), which is used to simulate the distribution of houses. F is the indicator test function.
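The description above can be sketched as follows. The use of a fixed-length buffer for \(Ghistory\) and the greedy acceptance rule follow the text; the sampling loop over all five houses per crab is an assumption, and `sphere` is a stand-in objective.

```python
import numpy as np
from collections import deque

def historical_path_search(pbest, pbest_fit, ghistory, lam, fitness, rng):
    """One historical-path-search step: each crab samples a candidate house
    around every recorded house in Ghistory (the houses most recently shed
    by alpha crabs) via GD(house, lambda) and moves only on improvement;
    otherwise it keeps (returns to) its original house."""
    for i in range(len(pbest)):
        for old_house in ghistory:                   # at most five houses
            candidate = rng.normal(old_house, lam)   # GD(old_house, lambda)
            f = fitness(candidate)
            if f < pbest_fit[i]:
                pbest[i], pbest_fit[i] = candidate, f
    return pbest, pbest_fit

# Ghistory as a ring buffer that keeps only the five most recent alpha houses
ghistory = deque(maxlen=5)
```

Sampling around several past alpha positions, rather than only the current one, is what widens the search and raises the chance of escaping a local optimum.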

Results
Experimental methods
To reflect the comprehensive performance of the HCOA, we choose the CEC2017 benchmark functions34. The CEC2017 suite includes unimodal functions (\(f1-f2\)), simple multipeak functions (\(f3-f9\)), hybrid functions (\(f10-f19\)) and composition functions (\(f20-f29\)). The test dimensions are 10, 30, 50 and 100; the highest recommended dimension, 100, is chosen to demonstrate performance in high-dimensional spaces. Five well-known parameter-free metaheuristics, BBPSO35, PBBPSO36, DLSBBPSO37, TBBPSO38 and ETBBPSO39, are used as comparison groups. To reduce the impact of chance errors on the experimental results, all trials are repeated 37 times. All the algorithms use a population size of 100, a maximum of 1.00E+4 iterations, and the same settings as in their original papers.
Experimental results
To better analyse the experimental results, GT is used to measure the performance of each algorithm. In this work, GT is defined as \(\left\| Global\;optimum-theoretical\;optimum \right\|\), the gap between the best value found and the known theoretical optimum of the benchmark function.
Specific numerical results, including the mean value (MV) and standard deviation (Std) of 37 independent runs, are displayed in Tables 1 and 2. The Friedman statistic test is used to analyse the results, and the rank results (RRs) are also shown in Tables 1 and 2. The average rank of the HCOA is 1.4828, which is 56.121% better than that of the second-ranked algorithm, BBPSO, indicating that the HCOA provides an effective solution to high-dimensional optimization problems. The average ranks are shown at the bottom of Table 2. The functions on which the HCOA ranks first among the 29 CEC2017 benchmark functions are listed in Table 3, and the remaining ranking results are in Table 4.
To demonstrate the convergence performance of the HCOA, the GT in different iterations for the HCOA, BBPSO, PBBPSO, DLSBBPSO, TBBPSO and ETBBPSO is also shown in Figs. 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30 and 31. In each figure, the horizontal coordinate denotes the number of generations, and the vertical coordinate denotes the value of GT.
After comparison and counting, the HCOA ranks first on 23 of the 29 CEC2017 benchmark functions, second, third, and fifth on two functions each, and never fourth or sixth. The ranking shows that the HCOA performs better than the other algorithms on the simple multipeak functions (f3–f9), the hybrid functions (f10–f19), and the composition functions.
According to Figs. 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30 and 31, except for f1, f8, f9, f13, f19, f24 and f25, the HCOA is significantly better than the other algorithms in terms of convergence speed and accuracy. The time complexities of the HCOA and the five control-group algorithms differ only by additive linear transformations that do not change the order of magnitude; therefore, the time complexity of the HCOA and the other optimization algorithms is the same, O(n).
Comparison with the new parameter-free algorithm
To further prove the superiority of the HCOA in high-dimensional optimization problems, we choose the state-of-the-art method, the BPSO-CM40 algorithm, as the control group, and conduct experiments on the highest dimension of 100 recommended by CEC2017. To minimize the effect of chance errors on the experimental results, all trials are repeated 37 times with a population size of 100 and a maximum of 1.00E+4 iterations. In addition, the overall effectiveness (OE) of the HCOA and BPSO-CM is computed from the results in Tables 5 and 6. The OE is calculated by Eq. (4):

$$OE = \frac{N-L}{N} \times 100\% \quad (4)$$

where N is the number of benchmark functions and L is the number of functions the target algorithm loses in the competition. The OE results are shown in Table 5. The results indicate that the HCOA has the best performance.
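The overall effectiveness can be computed directly from the win/loss counts. The helper below assumes the definition OE = (N − L)/N × 100%, which is consistent with the reported values (N = 29 functions and L = 9 losses give 68.97%).

```python
def overall_effectiveness(n_functions, n_losses):
    """OE = (N - L) / N * 100: the share of benchmark functions on which
    the target algorithm does not lose to its competitor, in percent."""
    return (n_functions - n_losses) / n_functions * 100.0

# the HCOA loses 9 of the 29 CEC2017 functions to BPSO-CM in Tables 5 and 6
print(round(overall_effectiveness(29, 9), 2))   # -> 68.97
```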
It can be seen from Tables 5 and 6 that the HCOA performs better than BPSO-CM on 20 functions. Meanwhile, the OE of the HCOA reaches 68.97%, which is 37.94 percentage points higher than the 31.03% of BPSO-CM. The experimental results show that the HCOA can provide high-precision solutions for single-objective high-dimensional optimization problems.
Conclusions
A novel hermit crab optimization algorithm (the HCOA) that produces high-precision results for high-dimensional optimization problems is proposed in this paper. The HCOA achieves high-accuracy resolution of single-objective optimization problems by modelling the behaviour of hermit crab populations. The optimal search and the historical path search are used in the HCOA to balance depth and breadth searches, and their cooperation achieves high-precision optimization in high-dimensional spaces. Moreover, both searches have linear computation times, which means that the time complexity of the HCOA is O(n).
In the experimental part of this paper, the CEC2017 benchmark functions are used. Of the 29 test functions, the HCOA scores 23 firsts. Compared with the state-of-the-art BBPSO-based method, BPSO-CM, the HCOA wins 20 of the 29 tests. All the experimental results demonstrate that the HCOA generates highly accurate and robust results for high-dimensional optimization problems.
However, the HCOA cannot be applied to multiobjective optimization problems and single-objective noncontinuous optimization problems. Furthermore, in the unimodal functions of CEC2017, the performance of the HCOA is inferior to that of BPSO-CM. Therefore, one of the main future research directions is applying the HCOA to multiobjective optimization problems. Additionally, due to the linear time complexity, combining the HCOA with other famous evolutionary strategies, such as SE and PSO, to achieve higher accuracy and greater robustness is another solid option.
Data availability
All data generated or analysed during this study are included in this published article and https://github.com/GuoJia-Lab-AI/crab.
References
Grefenstette, J. J. Genetic algorithms and machine learning 3–4. https://doi.org/10.1145/168304.168305 (1993).
Storn, R. & Price, K. Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11(4), 341–359 (1997).
He, S., Wu, Q. H. & Saunders, J. R. Group search optimizer: An optimization algorithm inspired by animal searching behavior. IEEE Trans. Evol. Comput. 13(5), 973–990. https://doi.org/10.1109/TEVC.2009.2011992 (2009).
Kennedy, J. & Eberhart, R. Particle swarm optimization. IEEE Int. Conf. Neural Netw. Conf. Proc. 4(1), 1942–1948. https://doi.org/10.4018/ijmfmp.2015010104 (1995).
Karaboga, D. & Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Global Optim. 39(3), 459–471. https://doi.org/10.1007/s10898-007-9149-x (2007).
Yang, X.-S. Firefly algorithms for multimodal optimization. In: Lecture notes in computer science (including subseries lecture notes in artificial intelligence and lecture notes in bioinformatics) vol. 5792 LNCS, pp. 169–178 (2009). https://doi.org/10.1007/978-3-642-04944-6_14
Yang, X. S. & Deb, S. Cuckoo search via lévy flights. https://doi.org/10.1109/NABIC.2009.5393690 (2009).
Meng, X., Liu, Y., Gao, X. & Zhang, H. A new bio-inspired algorithm: Chicken swarm optimization. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 8794. https://doi.org/10.1007/978-3-319-11857-4_10 (2014).
Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 27(4), 1053–1073. https://doi.org/10.1007/s00521-015-1920-1 (2016).
Yazdani, M. & Jolai, F. Lion Optimization Algorithm (LOA): A nature-inspired metaheuristic algorithm. J. Comput. Des. Eng. 3(1), 24–36. https://doi.org/10.1016/j.jcde.2015.06.003 (2016).
Dorigo, M. & Gambardella, L. M. Ant colony system: A cooperative learning approach to the traveling salesman problem. IEEE Trans. Evol. Comput. 1(1), 53–66. https://doi.org/10.1109/4235.585892 (1997).
Mirjalili, S. & Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 95, 51–67. https://doi.org/10.1016/j.advengsoft.2016.01.008 (2016).
You, A., & Zhang, L. Transportation vehicle scheduling optimization method based on improved multi-layer coding genetic algorithm. In: The 2nd International Conference on Computing and Data Science, vol. PartF16898, pp. 1–6. ACM, New York, NY, USA. https://doi.org/10.1145/3448734.3450840 (2021).
Kwiecień, J. & Pasieka, M. Cockroach Swarm optimization algorithm for travel planning. Entropy 19(5), 213. https://doi.org/10.3390/e19050213 (2017).
Jia, Y.-H., Mei, Y. & Zhang, M. A bilevel ant colony optimization algorithm for capacitated electric vehicle routing problem. IEEE Trans. Cybern. 52(10), 10855–10868. https://doi.org/10.1109/TCYB.2021.3069942 (2022).
Mehmood, K., Chaudhary, N.I., Khan, Z.A., Cheema, K.M. & Raja, M.A.Z. Variants of chaotic grey wolf heuristic for robust identification of control autoregressive model. Biomimetics 8(2) (2023). https://doi.org/10.3390/biomimetics8020141
Mehmood, K., Chaudhary, N.I., Khan, Z.A., Cheema, K.M., Raja, M.A.Z., Milyani, A.H. & Azhari, A.A. Dwarf mongoose optimization metaheuristics for autoregressive exogenous model identification. Mathematics 10(20) (2022). https://doi.org/10.3390/math10203821
Mehmood, K., Chaudhary, N.I., Khan, Z.A., Cheema, K.M., Raja, M.A.Z., Milyani, A.H. & Azhari, A.A. Nonlinear hammerstein system identification: A novel application of marine predator optimization using the key term separation technique. Mathematics 10(22). https://doi.org/10.3390/math10224217 (2022).
Mehmood, K., Chaudhary, N.I., Khan, Z.A., Raja, M.A.Z., Cheema, K.M. & Milyani, A.H. Design of aquila optimization heuristic for identification of control autoregressive systems. Mathematics 10(10) (2022). https://doi.org/10.3390/math10101749
Ding, Y. et al. A whale optimization algorithm-based cellular automata model for urban expansion simulation. Int. J. Appl. Earth Obs. Geoinf. 115(October), 103093. https://doi.org/10.1016/j.jag.2022.103093 (2022).
Sato, M., Fukuyama, Y., Iizaka, T. & Matsui, T. Total optimization of energy networks in a smart city by multi-population global-best modified brain storm optimization with migration. Algorithms 12(1), 15. https://doi.org/10.3390/a12010015 (2019).
Lakhan, A. et al. Federated-learning based privacy preservation and fraud-enabled blockchain iomt system for healthcare. IEEE J. Biomed. Health Inform. 27(2), 664–672. https://doi.org/10.1109/JBHI.2022.3165945 (2023).
M, P., Malviya, M., Hamdi, M., V, V., Mohammed, M.A., Rauf, H.T., & Al-Dhlan, K.A. 5g based blockchain network for authentic and ethical keyword search engine. IET Commun. 16(5), 442–448. https://doi.org/10.1049/cmu2.12251 (2022).
Lakhan, A. et al. Federated learning-aware multi-objective modeling and blockchain-enable system for iiot applications. Comput. Electr. Eng. 100, 107839. https://doi.org/10.1016/j.compeleceng.2022.107839 (2022).
Gaba, P., Raw, R. S., Mohammed, M. A., Nedoma, J. & Martinek, R. Impact of block data components on the performance of blockchain-based vanet implemented on hyperledger fabric. IEEE Access 10, 71003–71018. https://doi.org/10.1109/ACCESS.2022.3188296 (2022).
Iqbal, R., Doctor, F., More, B., Mahmud, S. & Yousuf, U. Big data analytics: Computational intelligence techniques and application areas. Technol. Forecast. Soc. Chang. 153, 119253. https://doi.org/10.1016/j.techfore.2018.03.024 (2020).
Zhou, L., Pan, S., Wang, J. & Vasilakos, A. V. Machine learning on big data: Opportunities and challenges. Neurocomputing 237, 350–361. https://doi.org/10.1016/j.neucom.2017.01.026 (2017).
Arif, M., Di Persio, L., Kumam, P., Watthayu, W. & Akgül, A. Heat transfer analysis of fractional model of couple stress Casson tri-hybrid nanofluid using dissimilar shape nanoparticles in blood with biomedical applications. Sci. Rep. 13(1), 4596. https://doi.org/10.1038/s41598-022-25127-z (2023).
Farooq, U. et al. Computational framework of cobalt ferrite and silver-based hybrid nanofluid over a rotating disk and cone: a comparative study. Sci. Rep. 13(1), 5369. https://doi.org/10.1038/s41598-023-32360-7 (2023).
Farooq, U. et al. A computational fluid dynamics analysis on Fe3O4-H2O based nanofluid axisymmetric flow over a rotating disk with heat transfer enhancement. Sci. Rep. 13(1), 4679. https://doi.org/10.1038/s41598-023-31734-1 (2023).
Li, Y., Zhao, Y. & Liu, J. Dimension by dimension dynamic sine cosine algorithm for global optimization problems. Appl. Soft Comput. 98, 106933. https://doi.org/10.1016/j.asoc.2020.106933 (2021).
Yang, Q., Zhu, Y., Gao, X., Xu, D. & Lu, Z. Elite directed particle swarm optimization with historical information for high-dimensional problems. Mathematics 10(9), 1384. https://doi.org/10.3390/math10091384 (2022).
Chen, G. et al. Efficient hierarchical surrogate-assisted differential evolution for high-dimensional expensive optimization. Inf. Sci. 542, 228–246. https://doi.org/10.1016/j.ins.2020.06.045 (2021).
Awad, N.H., Ali, M.Z., Liang, J., Qu, B.Y. & Suganthan, P.N. Problem definitions and evaluation criteria for the CEC 2017 special session and competition on real-parameter optimization. Nanyang Technol. Univ., Singapore, Tech. Rep., 1–34 (2016).
Kennedy, J. Bare bones particle swarms. 2003 IEEE Swarm Intelligence Symposium, SIS 2003 - Proceedings, 80–87. https://doi.org/10.1109/SIS.2003.1202251 (2003).
Guo, J. & Sato, Y. A pair-wise bare bones particle swarm optimization algorithm for nonlinear functions. Int. J. Network. Distrib. Comput. 5, 143–151. https://doi.org/10.2991/ijndc.2017.5.3.3 (2017).
Guo, J. & Sato, Y. A bare bones particle swarm optimization algorithm with dynamic local search. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 10385 LNCS, 158–165. https://doi.org/10.1007/978-3-319-61824-1_17 (2017).
Guo, J. et al. A twinning bare bones particle swarm optimization algorithm. PLoS ONE 17, 1–30. https://doi.org/10.1371/journal.pone.0267197 (2022).
Tian, H., Guo, J., Xiao, H., Yan, K. & Sato, Y. An electronic transition-based bare bones particle swarm optimization algorithm for high dimensional optimization problems. PLoS ONE 17, 1–23. https://doi.org/10.1371/journal.pone.0271925 (2022).
Guo, J. et al. A bare-bones particle swarm optimization with crossed memory for global optimization. IEEE Access 11, 31549–31568. https://doi.org/10.1109/ACCESS.2023.3250228 (2023).
Acknowledgements
We would like to acknowledge the support of the Natural Science Foundation of China (No. 52201363), Natural Science Foundation of Hubei Province (No. 2020CFB306 and No. 2019CFB778), Hubei Provincial Education Department Scientific Research Program Project (No. Q20222202), and the Ideological and Political Department Project of Hubei Province (No. 21Q210).
Author information
Authors and Affiliations
Contributions
J.G.: conceptualization, resources, software, and writing—review and editing. G.Z.: measurement, investigation,methodology, and software. B.S.: investigation, methodology, and software. Y.D.: conceptualization,formal analysis, funding acquisition, methodology, project administration, resources, software, supervision, and writing review and editing. Y.K.: conceptualization, measurement, methodology. Y.S. writing review and editing. All authors have read and agreed to the published version of the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Guo, J., Zhou, G., Yan, K. et al. A novel hermit crab optimization algorithm. Sci Rep 13, 9934 (2023). https://doi.org/10.1038/s41598-023-37129-6