Table 11 Performance of NER using only BERT architecture

From: Multi-dimensional intelligent reorganization and utilization of knowledge in ‘Biographies of Chinese Thinkers’

| Entity | Precision | Recall | F1 | Model |
| --- | --- | --- | --- | --- |
| weighted avg | 87.66% | 91.08% | 89.28% | BERT |
| weighted avg | 87.62% | 90.67% | 89.08% | RoBERTa |
| weighted avg | 87.71% | 91.14% | 89.34% | GujiBERT |
| weighted avg | 87.68% | 91.15% | 89.35% | GujiRoBERTa |
| weighted avg | 87.14% | 91.01% | 88.98% | SikuBERT |
| weighted avg | 87.19% | 90.90% | 88.95% | SikuRoBERTa |
| micro avg | 87.10% | 91.08% | 89.05% | BERT |
| micro avg | 87.12% | 90.67% | 88.86% | RoBERTa |
| micro avg | 87.15% | 91.14% | 89.10% | GujiBERT |
| micro avg | 87.21% | 91.15% | 89.14% | GujiRoBERTa |
| micro avg | 86.61% | 91.01% | 88.75% | SikuBERT |
| micro avg | 86.56% | 90.90% | 88.68% | SikuRoBERTa |
| macro avg | 79.79% | 86.52% | 82.92% | BERT |
| macro avg | 80.09% | 85.84% | 82.79% | RoBERTa |
| macro avg | 79.79% | 86.41% | 82.87% | GujiBERT |
| macro avg | 80.06% | 86.15% | 82.92% | GujiRoBERTa |
| macro avg | 79.56% | 86.35% | 82.73% | SikuBERT |
| macro avg | 79.26% | 86.34% | 82.54% | SikuRoBERTa |

  1. This table reports the entity recognition performance of the BERTology models alone, i.e., with the BiLSTM-CRF structure removed.
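The "BERT architecture only" setting described in the footnote corresponds to a plain token-classification head on top of the encoder, with no BiLSTM or CRF layer, and the three rows per model are the standard micro, macro, and weighted averages over entity types. As a rough illustration (not the authors' code), the sketch below shows how such a model could be loaded with Hugging Face Transformers and scored with seqeval, whose `classification_report` prints the same three averages; the checkpoint name, label set, and example sentence are assumed placeholders.

```python
# Minimal sketch, assuming a BIO tag set and a Hugging Face checkpoint;
# it evaluates a BERT-only NER model (token-classification head, no BiLSTM-CRF)
# and prints the micro/macro/weighted averages used in Table 11.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from seqeval.metrics import classification_report

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]   # illustrative label set
checkpoint = "bert-base-chinese"                      # placeholder; swap in GujiBERT/SikuBERT etc.

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint, num_labels=len(labels))

def predict_tags(text: str) -> list[str]:
    """Tag each character of a classical-Chinese sentence with the BERT head only."""
    enc = tokenizer(list(text), is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits                  # shape: (1, seq_len, num_labels)
    ids = logits.argmax(dim=-1)[0].tolist()[1:-1]     # drop [CLS] and [SEP]
    return [labels[i] for i in ids]

# Gold and predicted tag sequences (one list of tags per sentence).
y_true = [["B-PER", "I-PER", "I-PER", "O", "B-LOC"]]
y_pred = [predict_tags("王守仁游越")]

# seqeval reports per-entity scores plus micro, macro and weighted averages,
# the same three aggregation schemes shown in the table.
print(classification_report(y_true, y_pred, digits=4))
```

In practice, a checkpoint fine-tuned on the annotated biography corpus (BERT, RoBERTa, GujiBERT, GujiRoBERTa, SikuBERT, or SikuRoBERTa) would replace `bert-base-chinese`, and the report would be computed over the full test set rather than a single sentence.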