Table 2 Classification of neural networks by network structure.
| Type | Applicable Scenarios | Characteristics | Subcategories |
|---|---|---|---|
| FNN | Simple classification and regression tasks | Information flows unidirectionally from the input layer to the output layer, with no feedback connections | MLP, Autoencoder |
| CNN | Image classification, object detection, segmentation, etc. | Convolutional operations extract local features, making it well suited to image data | VGG, TCN |
| RNN | Natural language processing and time-series forecasting | Maintains an internal state (memory) across time steps, making it suitable for sequential data | LSTM, GRU |
| GAN | Image generation and style transfer | A generator and a discriminator produce data through adversarial training | cGAN, WGAN |
| GNN | Graph classification, node prediction, and edge prediction | Designed for graph-structured data, such as social networks and molecular structures | GCN, GAT |
| Attention Mechanism | Natural language processing and image-related tasks | Weights input elements by importance, focusing computation on the most relevant information | Transformer, Self-Attention Network |
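To make the FNN row concrete, the sketch below shows the unidirectional information flow the table describes: a minimal two-layer MLP forward pass in NumPy. All names, shapes, and weight values here are illustrative assumptions, not part of the classification above.

```python
import numpy as np

def relu(x):
    # Element-wise nonlinearity applied after the hidden layer
    return np.maximum(0.0, x)

def mlp_forward(x, W1, b1, W2, b2):
    # Information flows strictly input -> hidden -> output;
    # there are no feedback connections, as in the FNN row of Table 2.
    h = relu(x @ W1 + b1)   # hidden layer activations
    return h @ W2 + b2      # output layer (e.g. regression scores)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))              # batch of 4 samples, 3 features (hypothetical)
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)
y = mlp_forward(x, W1, b1, W2, b2)
print(y.shape)
```

The other rows differ mainly in how this forward computation is structured: CNNs replace the dense matrix products with convolutions over local neighborhoods, RNNs reuse the weights across time steps while carrying a hidden state, and attention layers compute data-dependent weights over the inputs.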