A new good point set stepwise shrinkage optimization in machine learning model for fog node performance prediction
  • Article
  • Open access
  • Published: 18 March 2026

  • Zhang Bo1,6,
  • Mohammad Kamrul Hasan1,
  • Elankovan A. Sundararajan1,
  • Shayla Islam2,
  • Peiying Zhang3,4,
  • Fatima Rayan Awad Ahmed5 &
  • Nissrein Babiker Mohammed Babiker6

Scientific Reports (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Electrical and electronic engineering
  • Mathematics and computing

Abstract

Advances in AI have transformed the traditional Internet of Things (IoT) into smarter, more versatile next-generation devices. Enhanced connectivity among sensors, actuators, and appliances boosts data availability and resource control in IoT networks. However, the growing number of decentralized IoT devices leads to exponential growth in service requests, creating significant adaptation challenges. Machine learning has become essential in IoT applications, and hyperparameter optimization is crucial to model performance. Existing optimization methods face challenges such as high uncertainty, poor model generalization, and high computational cost. This study focuses on two key aspects of hyperparameter optimization for fog computing node performance prediction: cross-validation techniques and search space methods. A new optimization approach, Good Point Set Stepwise Shrinkage, is proposed to reduce uncertainty, enhance model generalization, and lower computational cost. Using fog computing performance data from IoT devices, the proposed scheme optimized the hyperparameters of Support Vector Machine, Back Propagation Network, and Convolutional Neural Network models. The test results show Mean Squared Errors of 4.061, 4.114, and 3.963 ± 0.0323 for the Support Vector Machine, Back Propagation Network, and Convolutional Neural Network models, respectively. Good Point Set Stepwise Shrinkage resolves cross-validation uncertainty and mitigates the randomness of search-space sampling. Compared with Sequential Uniform Design methods, it offers a simpler way to construct a unified search space, making it a highly suitable approach for hyperparameter optimization in fog computing performance prediction in the IoT.
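The abstract describes two ingredients: a good point set (a deterministic low-discrepancy point construction) used to sample the hyperparameter search space, and a stepwise shrinkage of that space around the best candidate. The paper's exact procedure is in the linked repository; as an illustration of the general idea only, the sketch below uses the classical square-root-of-primes good point construction and a simple halve-and-recentre shrinkage rule. The function names, the number of points, and the shrink factor are our assumptions, not the authors' settings.

```python
import numpy as np

def good_point_set(n, dim):
    """n points in [0, 1)^dim via the square-root-of-primes good point
    construction: fractional parts of k * sqrt(p_j) for the first dim primes."""
    primes, c = [], 2
    while len(primes) < dim:
        if all(c % q for q in primes):  # trial division by all smaller primes
            primes.append(c)
        c += 1
    k = np.arange(1, n + 1).reshape(-1, 1)
    return np.modf(k * np.sqrt(primes))[0]  # fractional parts, shape (n, dim)

def stepwise_shrink(objective, lo, hi, n_points=20, iters=5, shrink=0.5):
    """Evaluate a good point set in the current box, re-centre the box on the
    best point found, shrink its width, and repeat (hypothetical settings)."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    base = good_point_set(n_points, lo.size)   # same pattern, rescaled each round
    best_x, best_f = None, np.inf
    for _ in range(iters):
        pts = lo + base * (hi - lo)
        vals = np.array([objective(p) for p in pts])
        i = vals.argmin()
        if vals[i] < best_f:
            best_x, best_f = pts[i], vals[i]
        half = shrink * (hi - lo) / 2.0        # next box has `shrink` x the width
        lo = np.maximum(lo, best_x - half)
        hi = np.minimum(hi, best_x + half)
    return best_x, best_f

# Toy usage: minimise a quadratic surrogate over a 2-D "hyperparameter" box.
x, f = stepwise_shrink(lambda p: (p[0] - 3) ** 2 + (p[1] - 7) ** 2,
                       lo=[0, 0], hi=[10, 10])
# best_x should land near the optimum (3, 7)
```

In a real hyperparameter study the lambda would be replaced by a cross-validated model-fitting routine, and each coordinate of the box would map to one hyperparameter (e.g. SVM C and gamma on a log scale).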

Data availability

The datasets generated during the current study are available in the CPU-task-execution dataset repository [https://github.com/forsaken0214/cpu-task-execution].

Program availability

The program generated during the current study is available at [https://github.com/forsaken0214/A-New-Good-Point-Set-Stepwise-Shrinkage-Optimization-in-Machine-Learning-Model].

Funding

This work has been supported by the Universiti Kebangsaan Malaysia, under DIP 2024–033.

Author information

Authors and Affiliations

  1. Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, 43600, Bangi, Malaysia

    Zhang Bo, Mohammad Kamrul Hasan & Elankovan A. Sundararajan

  2. Institute of Computer Science and Digital Innovation, UCSI University Malaysia, Kuala Lumpur, Malaysia

    Shayla Islam

  3. Qingdao Institute of Software, College of Computer Science and Technology, China University of Petroleum (East China), Qingdao, 266580, China

    Peiying Zhang

  4. Shandong Key Laboratory of Intelligent Oil & Gas Industrial Software, Qingdao, 266580, China

    Peiying Zhang

  5. Computer Science Department, College of Computer Engineering and Science, Prince Sattam Bin Abdulaziz University, Al-Kharj, Saudi Arabia

    Fatima Rayan Awad Ahmed

  6. Bisha - Information System and Cybersecurity Department, College of Computing and Information Technologies, University of Bisha, P. O Box 551, 61922, Bisha, Kingdom of Saudi Arabia

    Zhang Bo & Nissrein Babiker Mohammed Babiker

Contributions

Zhang Bo: Conceptualization, methodology, software, manuscript draft and review, validation, visualization; Mohammad Kamrul Hasan, Elankovan A. Sundararajan: Conceptualization, methodology, analysis, supervision, funding, review and editing; Shayla Islam, Peiying Zhang, Fatima Rayan Awad Ahmed, Nissrein Babiker Mohammed Babiker: Conceptualization, data analysis, visualization, validation, review and editing.

Corresponding authors

Correspondence to Mohammad Kamrul Hasan or Shayla Islam.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Bo, Z., Hasan, M.K., Sundararajan, E.A. et al. A new good point set stepwise shrinkage optimization in machine learning model for fog node performance prediction. Sci Rep (2026). https://doi.org/10.1038/s41598-026-41630-z

  • Received: 15 May 2025

  • Accepted: 22 February 2026

  • Published: 18 March 2026

  • DOI: https://doi.org/10.1038/s41598-026-41630-z

Keywords

  • Node performance prediction
  • Machine learning
  • Hyperparameter optimization
  • Good point set
  • Sequential experimental design
  • Consumer electronics
Scientific Reports (Sci Rep)

ISSN 2045-2322 (online)