Table 1 Introduction to the experimental models.
From: A multi agent classical Chinese translation method based on large language models
| Model Name | Model Introduction |
|---|---|
| DeepSeek V3 | An open-source Mixture of Experts (MoE) model developed by the DeepSeek team, featuring a large parameter scale and powerful language generation and understanding capabilities. |
| Qwen Plus | An efficient version of the Qwen series, suitable for moderately complex tasks. |
| Qwen Turbo | A Chinese LLM developed by Alibaba DAMO Academy, suitable for simple tasks. |
| GLM Air | A high-performance model based on the GLM architecture, optimized for Chinese-language tasks. |
| GLM Flash | A lightweight version of GLM, balancing inference speed and resource efficiency. |
| GPT-4o Mini | A streamlined version of GPT-4o introduced by OpenAI, suitable for resource-constrained scenarios. |
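All of these models expose OpenAI-compatible chat endpoints, so a single client can drive the whole experimental lineup. The sketch below shows one plausible way to query them; the base URLs and model identifier strings are assumptions drawn from the vendors' public documentation, not specifics given in the paper.

```python
# Minimal sketch: calling the experimental models through OpenAI-compatible
# chat endpoints. Base URLs and model IDs are assumptions, not from the paper.
from openai import OpenAI

# Hypothetical mapping from table entries to (base_url, model_id) pairs.
MODELS = {
    "DeepSeek V3": ("https://api.deepseek.com", "deepseek-chat"),
    "Qwen Plus":   ("https://dashscope.aliyuncs.com/compatible-mode/v1", "qwen-plus"),
    "Qwen Turbo":  ("https://dashscope.aliyuncs.com/compatible-mode/v1", "qwen-turbo"),
    "GLM Air":     ("https://open.bigmodel.cn/api/paas/v4", "glm-4-air"),
    "GLM Flash":   ("https://open.bigmodel.cn/api/paas/v4", "glm-4-flash"),
    "GPT-4o Mini": ("https://api.openai.com/v1", "gpt-4o-mini"),
}

def translate(model_name: str, classical_text: str, api_key: str) -> str:
    """Ask one of the experimental models to translate a classical Chinese passage."""
    base_url, model_id = MODELS[model_name]
    client = OpenAI(base_url=base_url, api_key=api_key)
    response = client.chat.completions.create(
        model=model_id,
        messages=[
            {"role": "system",
             "content": "Translate the following classical Chinese text into modern Chinese."},
            {"role": "user", "content": classical_text},
        ],
    )
    return response.choices[0].message.content
```

Routing every model through the same chat interface keeps the comparison fair: only the `base_url` and `model_id` change between runs, while prompts and decoding settings stay fixed.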