Extended Data Table 1 Statistics of the usage of different sizes of pre-trained models

From: Parameter-efficient fine-tuning of large-scale pre-trained language models

  1. The usage of models of different sizes in research published at NLP conferences. The statistics are based on 1,000 randomly selected papers. Large PLMs are defined as PLMs with over 1 billion parameters.