Figure 2

The architecture of the FC-CNN network. ReLU is the activation layer, BN is a batch normalization layer, DP is the dropout layer, and FC is the fully connected layer. The input vector has N features. Nn is the number of neurons in each FC layer, which may differ between layers. Softmax is the activation function of the classification output.
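As a rough illustration, the layer stack the caption describes (FC, then BN, ReLU, and DP, ending in an FC layer with a softmax head) could be sketched in NumPy as below. The layer sizes, the two-hidden-layer depth, and the dropout rate are assumptions for the sketch, not values taken from the figure.

```python
import numpy as np

rng = np.random.default_rng(0)

def fc(x, w, b):
    # Fully connected layer: x @ W + b
    return x @ w + b

def batch_norm(x, eps=1e-5):
    # Simplified batch normalization over the batch axis
    # (no learned scale/shift, for brevity)
    return (x - x.mean(axis=0, keepdims=True)) / np.sqrt(x.var(axis=0, keepdims=True) + eps)

def relu(x):
    return np.maximum(x, 0.0)

def dropout(x, rate=0.5, training=True):
    # Inverted dropout; identity at inference time
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

def softmax(x):
    # Numerically stable softmax over the class axis
    z = x - x.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical sizes: N input features, Nn neurons per FC layer, C classes
N, Nn, C, batch = 16, 32, 3, 8
x = rng.standard_normal((batch, N))

w1, b1 = 0.1 * rng.standard_normal((N, Nn)), np.zeros(Nn)
w2, b2 = 0.1 * rng.standard_normal((Nn, Nn)), np.zeros(Nn)
w_out, b_out = 0.1 * rng.standard_normal((Nn, C)), np.zeros(C)

# Two FC -> BN -> ReLU -> DP blocks, then an FC + softmax classification head
h = dropout(relu(batch_norm(fc(x, w1, b1))))
h = dropout(relu(batch_norm(fc(h, w2, b2))))
probs = softmax(fc(h, w_out, b_out))
```

Each row of `probs` is a probability distribution over the C classes, summing to 1.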