Table 2 Main neural network architectures in the HFC-AES model.

From: Intelligent text analysis for effective evaluation of English language teaching based on deep learning

| Model architecture | Core function | Advantage | Application location in this study |
|---|---|---|---|
| Convolutional Neural Network (CNN) | Extracts local semantic features of the text | Good at capturing local patterns, with relatively few parameters and strong stability | Local semantic modeling in shallow and deep text feature extraction |
| Long Short-Term Memory (LSTM) | Captures long-distance dependencies and global semantic relationships in the text | Mitigates the vanishing-gradient problem of traditional RNNs and is suitable for processing long text sequences | Deep feature extraction, strengthening semantic coherence across sentence sequences |
| Hierarchical neural network | Models the local and global structure of the text hierarchically | Preserves the text hierarchy and enhances topic-related semantic understanding | Topic-related feature extraction stage, modeling the relationship between the essay prompt and the composition content |
| Attention mechanism | Dynamically adjusts the weights of different features to focus on key semantic information | Strengthens the model's ability to identify important information and improves task adaptability | Topic-related stage, supporting cross-attention-based multi-task feature fusion |
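To make the division of labor in Table 2 concrete, the following is a minimal, hypothetical PyTorch sketch of how these four components could be wired together for essay scoring: a CNN for local features, a BiLSTM for global sequence features, and cross-attention between the essay and the prompt for topic-related fusion. The class name `HFCAESSketch`, the layer sizes, and the pooling/regression head are illustrative assumptions, not the paper's actual HFC-AES implementation.

```python
# Hypothetical sketch combining the components listed in Table 2.
# All names and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn


class HFCAESSketch(nn.Module):
    def __init__(self, vocab_size=30000, emb_dim=128, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # CNN: local semantic features (n-gram-like patterns)
        self.conv = nn.Conv1d(emb_dim, hid_dim, kernel_size=3, padding=1)
        # BiLSTM: long-distance dependencies / global semantics
        self.lstm = nn.LSTM(hid_dim, hid_dim, batch_first=True,
                            bidirectional=True)
        # Cross-attention: essay representations attend to prompt (topic) tokens
        self.cross_attn = nn.MultiheadAttention(2 * hid_dim, num_heads=4,
                                                batch_first=True)
        self.prompt_proj = nn.Linear(emb_dim, 2 * hid_dim)
        # Regression head producing a single holistic score
        self.scorer = nn.Linear(2 * hid_dim, 1)

    def forward(self, essay_ids, prompt_ids):
        e = self.embed(essay_ids)                         # (batch, seq, emb)
        # Conv1d expects (batch, channels, seq)
        local = torch.relu(self.conv(e.transpose(1, 2))).transpose(1, 2)
        seq, _ = self.lstm(local)                         # global features
        prompt = self.prompt_proj(self.embed(prompt_ids))
        fused, _ = self.cross_attn(seq, prompt, prompt)   # topic relevance
        pooled = fused.mean(dim=1)                        # essay-level vector
        return self.scorer(pooled).squeeze(-1)


# Usage with dummy token ids
model = HFCAESSketch()
essays = torch.randint(1, 30000, (4, 200))   # 4 essays, 200 tokens each
prompts = torch.randint(1, 30000, (4, 30))   # matching prompts
scores = model(essays, prompts)              # shape: (4,)
```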