Table 2 Comparison of modern transformer models for department classification (fine-tuned vs. zero-shot).
| Model | Accuracy (%) | Latency (ms) | GPU required |
|---|---|---|---|
| BERT-base (fine-tuned) | 94.1 | 128 | Yes |
| RoBERTa-base (fine-tuned) | 94.8 | 141 | Yes |
| DeBERTa-v3-base (fine-tuned) | 95.2 | 158 | Yes |
| LLaMA-3 8B (zero-shot) | 93.7 | 420 | Yes |
| MobileBERT (zero-shot) | 92.4 | 19 | No |
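The table implies a selection rule: take the lowest-latency model that clears an accuracy floor, subject to hardware constraints. A minimal sketch of that rule, using only the values from Table 2 (the helper name `pick_model` is ours, not from any library):

```python
# Table 2 data: (model, accuracy %, latency ms, gpu_required)
MODELS = [
    ("BERT-base (fine-tuned)", 94.1, 128, True),
    ("RoBERTa-base (fine-tuned)", 94.8, 141, True),
    ("DeBERTa-v3-base (fine-tuned)", 95.2, 158, True),
    ("LLaMA-3 8B (zero-shot)", 93.7, 420, True),
    ("MobileBERT (zero-shot)", 92.4, 19, False),
]

def pick_model(min_accuracy: float, gpu_available: bool = True):
    """Return the lowest-latency model meeting the accuracy floor,
    or None if no model qualifies under the given constraints."""
    candidates = [
        m for m in MODELS
        if m[1] >= min_accuracy and (gpu_available or not m[3])
    ]
    return min(candidates, key=lambda m: m[2])[0] if candidates else None

# CPU-only deployment: only MobileBERT qualifies.
print(pick_model(min_accuracy=90.0, gpu_available=False))
# GPU available, accuracy floor of 95%: DeBERTa wins despite its latency.
print(pick_model(min_accuracy=95.0))
```

Note how the GPU constraint dominates the choice: on CPU-only hardware the rule always lands on MobileBERT regardless of the accuracy margin over the other rows.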