Table 13 Experimental results of Chinese literary entity identification (%)
From: HsscBERT: pre-training domain model for the full text of Chinese humanity and social science
| Model | Precision (%) | Recall (%) | F1 (%) |
|---|---|---|---|
| BERT-base-Chinese | 69.51 | 72.53 | 70.99 |
| Chinese-RoBERTa-wwm-ext | 48.37 | 52.95 | 50.56 |
| HsscBERT_e3 | 71.31 | 73.45 | 72.36 |
| HsscBERT_e5 | 71.17 | 73.64 | 72.38 |
| LLAMA3.1-8B | 67.32 | 64.15 | 65.70 |
| GPT3.5-turbo | 43.82 | 15.43 | 18.20 |
| GPT4-turbo | 60.46 | 32.87 | 42.59 |
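
The table reports entity-level precision, recall, and F1 for literary entity identification. The exact evaluation protocol is not shown on this page; the sketch below assumes the common exact-match setting, in which a predicted entity counts as a true positive only if both its span and its type match a gold entity, and F1 is the harmonic mean of precision and recall. The function and variable names (`prf_exact_match`, `gold`, `pred`) are illustrative, not taken from the paper.

```python
from typing import Iterable, Set, Tuple

Entity = Tuple[int, int, str]  # (start, end, type) of one entity mention


def prf_exact_match(gold: Iterable[Entity], pred: Iterable[Entity]) -> Tuple[float, float, float]:
    """Entity-level precision/recall/F1 under exact span-and-type matching."""
    gold_set: Set[Entity] = set(gold)
    pred_set: Set[Entity] = set(pred)
    tp = len(gold_set & pred_set)  # predictions that exactly match a gold entity
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


if __name__ == "__main__":
    # Toy example: 3 gold entities, 3 predictions, 2 exact matches.
    gold = [(0, 2, "PER"), (5, 8, "LOC"), (10, 12, "WORK")]
    pred = [(0, 2, "PER"), (5, 8, "LOC"), (13, 15, "PER")]
    p, r, f = prf_exact_match(gold, pred)
    print(f"P={p:.2%} R={r:.2%} F1={f:.2%}")  # P=66.67% R=66.67% F1=66.67%
```

As a sanity check, the harmonic-mean relation reproduces, for example, the BERT-base-Chinese row: 2 × 69.51 × 72.53 / (69.51 + 72.53) ≈ 70.99.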