Figure 6 | Scientific Reports

From: Neuron type classification in rat brain based on integrative convolutional and tree-based recurrent neural networks

The proposed integrated DNN architecture combines a ResNet CNN for 2D-image feature detection with a Tree-RNN for SWC-format feature detection. (a) Diagram of the processing pipeline, comprising data preparation, feature learning, neuron classification, and categorization. (b) The structure of ResNet18 for image classification. The input is a three-channel image built from the projections onto the X, Y, and Z axes. Before the residual blocks, the data pass through convolution, batch normalization (BN), ReLU activation, and max pooling. The following four layers use different residual-block hyperparameters, as shown in Table 2. The output is the feature vector produced after average pooling and a fully connected layer. (c) The proposed Tree-RNN, including the standard structures of the RNN and LSTM modules. The tree contains five 2-layer LSTM blocks, with black arrows representing the connections between them. Every block has the same structure: two hidden layers of 128 neurons each. The result is then output through a fully connected layer. The dotted boxes show the basic operating units of the RNN and LSTM. (d) The submodule of a ResNet layer. (e) The submodule of the Tree-RNN, which can also be viewed as a conventional 2-layer RNN.
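The bottom-up flow of panel (c) can be illustrated with a minimal stand-in: each block folds its children's hidden states into its own before emitting an output. This sketch uses a scalar tanh RNN cell with hypothetical fixed weights (`W_x`, `W_h`) in place of the paper's 2-layer, 128-unit LSTM blocks; the recursion over the tree, not the cell itself, is the point.

```python
import math

def rnn_cell(x, h, W_x=0.5, W_h=0.3, b=0.0):
    # Scalar stand-in for an LSTM block: h' = tanh(W_x*x + W_h*h + b).
    # Weights are illustrative placeholders, not the paper's parameters.
    return math.tanh(W_x * x + W_h * h + b)

def tree_rnn(node):
    # node = (feature, [children]); post-order traversal:
    # process children first, fold their outputs into a running
    # hidden state, then combine with this node's own feature.
    feature, children = node
    h = 0.0
    for child in children:
        h = rnn_cell(tree_rnn(child), h)
    return rnn_cell(feature, h)

# Toy 5-block tree echoing the figure: a root with two branches.
tree = (1.0, [(0.5, [(0.2, [])]), (0.8, [(0.1, [])])])
out = tree_rnn(tree)
```

In the actual model each node would carry SWC-derived features (coordinates, radius, branch type) and the final hidden state would feed the fully connected output layer shown in (c).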
