Table 4 Hyperparameters for transformer models.

From: Dataset creation and benchmarking for Kashmiri news snippet classification using fine-tuned transformer and LLM models in a low resource setting

Hyperparameter/setting   Value/description
Tokenizer                AutoTokenizer with a maximum length of 128 tokens
Learning rate            2e-5
Batch size               16
Epochs                   10
Evaluation metrics       Accuracy, F1-score
Loss function            Cross-entropy (Trainer default)
Optimizer                AdamW (Trainer default)
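As a minimal sketch, the settings above can be mapped onto a Hugging Face Trainer configuration. The model name, output directory, and helper function here are illustrative assumptions, not taken from the paper; only the hyperparameter values come from the table.

```python
# Hyperparameter values from Table 4, collected in one place.
HYPERPARAMS = {
    "max_length": 128,                     # tokenizer truncation length
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "num_train_epochs": 10,
}


def build_training_arguments(output_dir: str = "out"):
    """Map the Table 4 values onto transformers.TrainingArguments.

    Requires the `transformers` library. The Trainer uses AdamW as its
    default optimizer and cross-entropy loss for classification heads,
    matching the table; `output_dir` is a placeholder assumption.
    """
    from transformers import TrainingArguments

    return TrainingArguments(
        output_dir=output_dir,
        learning_rate=HYPERPARAMS["learning_rate"],
        per_device_train_batch_size=HYPERPARAMS["per_device_train_batch_size"],
        num_train_epochs=HYPERPARAMS["num_train_epochs"],
    )
```

The tokenizer's `max_length` of 128 is applied separately, when encoding the news snippets (e.g. `tokenizer(text, truncation=True, max_length=128)`), rather than through `TrainingArguments`.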