Table 4. Comparative summary of existing hybrid ANFIS-based models and the proposed GEP–ANFIS framework [124,125,126].
| Model | Optimization Strategy | Structural Adaptation | Learning Mechanism | Computational Complexity | Limitations | Distinct Contribution of Proposed GEP–ANFIS |
|---|---|---|---|---|---|---|
| ANFIS–GA | Genetic Algorithm for parameter tuning | Fixed fuzzy structure; only parameters optimized | Global stochastic search + ANFIS gradient fine-tuning | Moderate–High | Prone to local minima; slow convergence | Introduces symbolic rule evolution via GEP; enables both rule and parameter optimization |
| ANFIS–PSO | Particle Swarm Optimization | Fixed fuzzy structure | Particle-based swarm exploration + ANFIS adaptation | Moderate | Sensitive to initial conditions; lacks interpretability | GEP evolves interpretable rule expressions before ANFIS refinement; improved generalization |
| ANFIS–XGBoost | Gradient-boosted regression for feature optimization | No fuzzy structure adaptation; black-box ensemble | Gradient boosting with tree ensembles | High | High data requirement; poor explainability | Combines symbolic regression (GEP) with fuzzy reasoning (ANFIS) for transparent hybrid optimization |
| ANFIS–PSO–GA | Dual metaheuristic parameter tuning | Static fuzzy structure | Hybrid PSO–GA + ANFIS local learning | Very High | Computationally expensive; unstable for real-time use | Reduces computational cost via zero-order GEP optimization requiring fewer iterations |
| Proposed GEP–ANFIS | Gene Expression Programming for structure evolution + ANFIS hybrid learning | Dynamic rule evolution and adaptive membership functions | Symbolic regression (GEP) + local hybrid learning (ANFIS) | Low–Moderate | None observed; scalable with fewer training samples | Dual-layer optimization (structural + parametric); interpretable rules; scalable to dynamic industrial EMS (see the sketch below) |
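To make the dual-layer optimization in the final row of Table 4 concrete, the following Python sketch pairs a structural stage with an ANFIS-style parametric stage. It is a minimal illustration under stated assumptions, not the framework's implementation: the `gep_stage` function is a hypothetical placeholder that merely proposes an initial rule base (in the actual framework this role is played by the GEP-evolved symbolic rules), and the parametric stage assumes Gaussian membership functions, first-order Takagi–Sugeno consequents solved by least squares, and a simple numerical-gradient update for the premise parameters.

```python
# Minimal two-stage sketch: a (placeholder) structural stage followed by
# ANFIS-style parametric refinement. Assumptions: Gaussian membership
# functions, first-order Takagi-Sugeno consequents, numerical-gradient
# updates for premise parameters. `gep_stage` is hypothetical and only
# stands in for the GEP-evolved rule base described in the paper.
import numpy as np

rng = np.random.default_rng(0)

def gep_stage(X, n_rules=4):
    """Hypothetical stand-in for the GEP structure-evolution stage:
    proposes rule centres and widths spanning the data range."""
    centres = np.linspace(X.min(0), X.max(0), n_rules)          # (R, d)
    widths = np.ones_like(centres) * ((X.max(0) - X.min(0)) / n_rules + 1e-6)
    return centres, widths

def firing_strengths(X, centres, widths):
    # Product of per-input Gaussian memberships, normalised per sample
    diff = X[:, None, :] - centres[None, :, :]                  # (N, R, d)
    w = np.exp(-0.5 * np.sum((diff / widths[None]) ** 2, axis=2))
    return w / (w.sum(axis=1, keepdims=True) + 1e-12)           # (N, R)

def fit_consequents(X, y, wn):
    # Linear consequent parameters solved in closed form (least squares)
    N, R = wn.shape
    X_aug = np.c_[X, np.ones(N)]                                # (N, d+1)
    Phi = (wn[:, :, None] * X_aug[:, None, :]).reshape(N, -1)   # (N, R*(d+1))
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return theta, Phi @ theta

def train_gep_anfis(X, y, n_rules=4, epochs=50, lr=0.01, eps=1e-4):
    centres, widths = gep_stage(X, n_rules)                     # structural layer
    for _ in range(epochs):                                     # parametric layer
        wn = firing_strengths(X, centres, widths)
        theta, y_hat = fit_consequents(X, y, wn)
        base_mse = np.mean((y_hat - y) ** 2)
        # Numerical gradient on the premise parameters (kept simple on purpose)
        for P in (centres, widths):
            grad = np.zeros_like(P)
            for idx in np.ndindex(P.shape):
                P[idx] += eps
                _, y_p = fit_consequents(X, y, firing_strengths(X, centres, widths))
                grad[idx] = (np.mean((y_p - y) ** 2) - base_mse) / eps
                P[idx] -= eps
            P -= lr * grad
        np.clip(widths, 1e-3, None, out=widths)                 # keep widths positive
    return centres, widths, theta

# Toy usage on a one-dimensional nonlinear target
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
centres, widths, theta = train_gep_anfis(X, y)
_, y_fit = fit_consequents(X, y, firing_strengths(X, centres, widths))
print("training MSE:", round(float(np.mean((y_fit - y) ** 2)), 4))
```

In the actual framework the structural stage would supply evolved rule expressions rather than evenly spaced centres, but the two-stage flow (structure first, parameters second) is the property that Table 4 contrasts against the single-stage parameter-tuning hybrids in the earlier rows.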