Abstract
Advances in AI have transformed the traditional Internet of Things (IoT) into smarter, more versatile next-generation devices. Enhanced connectivity among sensors, actuators, and appliances improves data availability and resource control in IoT networks. However, the growing number of decentralized IoT devices drives exponential growth in service requests, creating significant adaptation challenges. Machine learning has become essential in IoT applications, and hyperparameter optimization is crucial to model performance, yet existing optimization methods suffer from high uncertainty, poor model generalization, and high computational cost. This study addresses two key aspects of hyperparameter optimization for fog computing node performance prediction: cross-validation techniques and search-space construction. We propose a new optimization approach, Good Point Set Stepwise Shrinkage, to reduce uncertainty, enhance model generalization, and lower computational cost. Using fog computing performance data from IoT devices, the proposed scheme optimized the hyperparameters of Support Vector Machine (SVM), Back Propagation Network (BPN), and Convolutional Neural Network (CNN) models, yielding test Mean Squared Errors of 4.061, 4.114, and 3.963 ± 0.0323, respectively. Good Point Set Stepwise Shrinkage resolves cross-validation uncertainty and mitigates the randomness of the search space; compared with Sequential Uniform Designs, it offers a simpler way to construct a unified search space. These properties make Good Point Set Stepwise Shrinkage a highly suitable approach for hyperparameter optimization in fog computing performance prediction for the IoT.
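The core idea named in the abstract, sampling a search space with a good point set and then stepwise shrinking the space around the incumbent best point, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the square-root construction (fractional parts of k·√p over the first primes) is one common good-point-set variant, and the point count, number of rounds, and shrinkage factor are assumed values.

```python
import math

def good_point_set(n, dim):
    """Generate n points in [0, 1)^dim via the square-root good point set:
    coordinate i of point k is the fractional part of k * sqrt(p_i),
    where p_i is the i-th prime (one common construction)."""
    primes = []
    c = 2
    while len(primes) < dim:
        if all(c % q for q in primes):  # every prime < c is already in the list
            primes.append(c)
        c += 1
    return [[math.modf(k * math.sqrt(p))[0] for p in primes]
            for k in range(1, n + 1)]

def stepwise_shrink(objective, bounds, n_points=20, rounds=4, shrink=0.5):
    """Evaluate a good point set scaled into the current box, keep the best
    point, shrink the box around it, and repeat for a few rounds."""
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    best_x, best_f = None, float("inf")
    for _ in range(rounds):
        for u in good_point_set(n_points, len(bounds)):
            x = [l + ui * (h - l) for l, h, ui in zip(lo, hi, u)]
            f = objective(x)
            if f < best_f:
                best_x, best_f = x, f
        # shrink the box around the incumbent, clipped to the original bounds
        half = [(h - l) * shrink / 2 for l, h in zip(lo, hi)]
        lo = [max(b[0], bx - hw) for b, bx, hw in zip(bounds, best_x, half)]
        hi = [min(b[1], bx + hw) for b, bx, hw in zip(bounds, best_x, half)]
    return best_x, best_f
```

In practice `objective` would be a cross-validated model error over hyperparameters (e.g. an SVM's C and gamma); here any low-dimensional function serves to show the shrinking search.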
Data availability
The datasets generated during the current study are available in the CPU-task-execution dataset repository [https://github.com/forsaken0214/cpu-task-execution].
Program availability
The program generated during the current study is available at [https://github.com/forsaken0214/A-New-Good-Point-Set-Stepwise-Shrinkage-Optimization-in-Machine-Learning-Model].
Funding
This work has been supported by the Universiti Kebangsaan Malaysia, under DIP 2024–033.
Author information
Authors and Affiliations
Contributions
Zhang Bo: conceptualization, methodology, software, manuscript draft and review, validation, visualization. Mohammad Kamrul Hasan and Elankovan A. Sundararajan: conceptualization, methodology, analysis, supervision, funding acquisition, review and editing. Shayla Islam, Peiying Zhang, Fatima Rayan Awad Ahmed, and Nissrein Babiker Mohammed Babiker: conceptualization, data analysis, visualization, validation, review and editing.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Bo, Z., Hasan, M.K., Sundararajan, E.A. et al. A new good point set stepwise shrinkage optimization in machine learning model for fog node performance prediction. Sci Rep (2026). https://doi.org/10.1038/s41598-026-41630-z
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41598-026-41630-z