Table 1. Experimental setup and hyperparameter optimization.
From: Cattle identification based on multiple feature decision layer fusion
| System specifications | Ubuntu 22.04.2; Intel(R) Core(TM) i7-7800X CPU (base frequency 3.50 GHz); 64 GB RAM; 2 × NVIDIA GeForce RTX 2080 Ti GPUs | | |
|---|---|---|---|
| Experimental environment | CUDA 11.3; Python 3.8.10; PyTorch 1.12.0 | | |
| Parameter tuning methods | Grid search and cross-validation were used to tune model parameters. | | |
| Model optimal hyperparameters (best values highlighted in bold) | | | |
| Decision Tree (DT) | 'max_depth': [None, 10, 20, 30]; 'min_samples_split': [2, 5, 10]; 'min_samples_leaf': [1, 2, 4] | Bagging | 'n_estimators': [2, 10, 15, 20]; 'max_samples': [0.5, 1.0]; 'bootstrap': [True, False] |
| Logistic regression (Lr) | 'penalty': ['l1', 'l2']; 'C': [0.01, 0.1, 1, 10]; 'solver': ['liblinear', 'lbfgs', 'saga'] | Gradient_Boosting_Classifier | 'n_estimators': [50, 100, 150]; 'learning_rate': [0.01, 0.1, 0.2]; 'max_depth': [3, 5, 7] |
| Gaussian_NB (GS) | 'var_smoothing': [1e-9, 1e-8, 1e-7, 1e-6] | LightGBM | 'n_estimators': [50, 100, 150]; 'learning_rate': [0.01, 0.1, 0.2]; 'max_depth': [3, 5, 7] |
| Random_Forests (RF) | 'n_estimators': [50, 100, 150, 200]; 'criterion': ['gini', 'entropy']; 'max_depth': [None, 10, 20, 30]; 'min_samples_split': [2, 5, 10]; 'min_samples_leaf': [1, 2, 4]; 'max_features': [None, 'sqrt', 'log2'] | XGBoost | 'n_estimators': [50, 100, 150]; 'learning_rate': [0.01, 0.1, 0.2]; 'max_depth': [3, 5, 7] |
| Voting Classifier | 'voting': ['hard', 'soft']; 'weights': {'DT': 2, 'LG': 4, 'GS': 1, 'RF': 3} | Stacking | 'cv': [5, 10]; 'stack_method': 'auto' |
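The grid-search-with-cross-validation procedure in Table 1 can be sketched as below, using the decision-tree grid as an example. This is a minimal illustration with scikit-learn's `GridSearchCV`; the synthetic dataset, scoring metric, and random seeds are assumptions standing in for the paper's fused cattle feature vectors, which are not reproduced here.

```python
# Sketch: tune the decision-tree hyperparameters from Table 1 via
# grid search with 5-fold cross-validation (scikit-learn).
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Stand-in data (assumption); replace with the fused cattle features.
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=10, n_classes=4,
                           random_state=0)

# Candidate grid for DT, copied from Table 1.
param_grid = {
    "max_depth": [None, 10, 20, 30],
    "min_samples_split": [2, 5, 10],
    "min_samples_leaf": [1, 2, 4],
}

# Exhaustively evaluate every combination with 5-fold CV and
# keep the one with the best mean accuracy.
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_)
```

The other rows of the table follow the same pattern: swap in the corresponding estimator (e.g. `RandomForestClassifier`, `LogisticRegression`) and its grid.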