Table 1 Summary of existing works.
| Related work | Methodology | Strengths | Weaknesses |
|---|---|---|---|
| Hu et al. 48 | Enhanced hybrid AOA (CSOAOA) | Improves exploitation, avoids local optima, increases convergence accuracy | Imbalanced exploration-exploitation despite improved accuracy, potential for slow convergence in high-dimensional problems |
| Shen et al. 49 | Multi-population evolved WOA (MEWOA) | Increases convergence speed, avoids local optima, competitive performance | May face challenges in extremely high-dimensional problems despite improved convergence speed |
| Qiao et al. 50 | Hybrid AOA-HHO for MTIS | Improves segmentation accuracy, PSNR, SSIM, execution time | Focused on image segmentation, lacks general applicability across other domains |
| Qiu et al. 51 | Improved Gray Wolf Optimization (IGWO) | Improves convergence speed, solution accuracy, escapes local minima | Requires fine-tuning to maintain performance across diverse problems, risk of local optima |
| Houssein et al. 52 | Improved Sooty Tern Optimization Algorithm (mSTOA) | Balances exploration/exploitation, avoids sub-optimal convergence | Control parameter sensitivity may lead to inconsistent performance in complex cases |
| Wu et al. 53 | Variant Ant Colony Optimization (MAACO) | Reduces path length, turn times, improves convergence speed | High computational cost for large-scale problems despite improved path planning performance |
| Nadimi-Shahraki et al. 54 | Enhanced Whale Optimization Algorithm (E-WOA) | Improves population diversity and search strategy | Struggles with maintaining balance in multi-objective tasks, relies heavily on parameter adjustment |
| Askr et al. 55 | Binary Enhanced Golden Jackal Optimization (BEGJO) | Boosts exploration and exploitation, outperforms in classification accuracy | Computationally expensive, may not generalize well to larger datasets despite classification improvements |
| Ozkaya et al. 56 | Adaptive Fitness-Distance Balance ARO (AFDB-ARO) | Balances exploration/exploitation, achieves optimal solutions | May struggle with large-scale problems despite performance in benchmark tests |
| Yıldız et al. 57 | Hybrid AOA-NM | Improves solution quality, avoids local optima traps | Limited applicability outside constrained design problems |
| Deng et al. 58 | Improved Whale Optimization Algorithm (IWOA) | Improves convergence speed, stability, accuracy | Challenges in dealing with complex constraints, potential slow convergence |
| Tan and Mohamad-Saleh 59 | Hybrid Equilibrium Whale Optimization Algorithm (EWOA) | Superior statistical performance, convergence rate, robustness | Improved robustness but limited efficiency in more complex, high-dimensional spaces |
| Mahajan et al. 60 | Hybrid AO-AOA | Effective in high- and low-dimensional problems | Limited exploration in certain complex tasks |
| Qian et al. 61 | Hybrid SSACO | Avoids local optima, improves convergence accuracy | Limited exploration capabilities in high-dimensional, non-convex problems |
| Zhu et al. 62 | Enhanced Dung Beetle Optimization (QHDBO) | Improves convergence speed, accuracy, robustness | Still prone to local optima in extremely challenging problems despite overall improvements |
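Several of the works in Table 1 (MEWOA, E-WOA, IWOA, EWOA) are built on the baseline Whale Optimization Algorithm, and their listed strengths and weaknesses largely concern the exploration-exploitation balance and escape from local optima. The sketch below is a minimal, illustrative implementation of the canonical WOA loop only, not of any cited variant; the sphere objective, parameter values, and function names are our own assumptions for demonstration.

```python
import numpy as np

def sphere(x):
    """Toy objective assumed for illustration: minimize the sum of squares."""
    return float(np.sum(x ** 2))

def whale_optimization(obj, dim=10, pop_size=30, max_iter=200,
                       lower=-100.0, upper=100.0, b=1.0, seed=0):
    """Minimal sketch of the baseline WOA loop; the variants in Table 1 modify parts of it."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lower, upper, size=(pop_size, dim))   # random initial population
    fitness = np.array([obj(x) for x in X])
    best_idx = int(np.argmin(fitness))
    best_x, best_f = X[best_idx].copy(), fitness[best_idx]

    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter                       # a decreases linearly from 2 to 0
        for i in range(pop_size):
            r1, r2 = rng.random(), rng.random()
            A = 2.0 * a * r1 - a                           # |A| >= 1 triggers exploration
            C = 2.0 * r2
            if rng.random() < 0.5:
                if abs(A) < 1.0:
                    # Exploitation: encircle the current best solution
                    D = np.abs(C * best_x - X[i])
                    X[i] = best_x - A * D
                else:
                    # Exploration: search around a randomly chosen whale
                    X_rand = X[rng.integers(pop_size)]
                    D = np.abs(C * X_rand - X[i])
                    X[i] = X_rand - A * D
            else:
                # Exploitation: logarithmic spiral towards the best solution
                D_best = np.abs(best_x - X[i])
                l = rng.uniform(-1.0, 1.0)
                X[i] = D_best * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best_x

            X[i] = np.clip(X[i], lower, upper)             # respect the search bounds
            f = obj(X[i])
            if f < best_f:                                 # greedy update of the global best
                best_x, best_f = X[i].copy(), f

    return best_x, best_f

if __name__ == "__main__":
    x_star, f_star = whale_optimization(sphere, dim=10)
    print(f"Best sphere fitness after 200 iterations: {f_star:.3e}")
```

In the table's terms, the branch taken when |A| >= 1 is the exploration phase, while the encircling and spiral updates are the exploitation phase; the weaknesses listed above generally stem from how and when a given variant switches between these phases.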