Table 2. Hyperparameters of the classifiers.
| Methods | Parameters |
|---|---|
| CMN-ShuffleNet model | input layers: 1; convolution layer 1, filter size: 24, activation: relu; output layer 1, activation: softmax; optimizer: adam; batch_size = 64; loss = 'categorical_crossentropy'; epochs = 50; metrics = ['accuracy'] |
| GRU | input layers: 1; bidirectional GRU layer 1, units = 128, activation = 'softmax'; dropout = 0.2; batch_size = 128; epochs = 50; loss = 'categorical_crossentropy' |
| LeNet | Conv2D(16, (5, 5), activation = 'relu', input_shape = input_shape_); MaxPooling2D((2, 2)); Conv2D(32, (5, 5), activation = 'relu'); MaxPooling2D((2, 2)); Flatten(); Dense(120, activation = 'relu'); Dense(84, activation = 'relu') |
| Bi-LSTM | input layers: 1; bidirectional LSTM layer 1, units = 128, activation = 'relu', return_sequences = True; dropout = 0.5; batch_size = 28 |
| ShuffleNet | input layers: 1; convolution layer 1, filters = 24, kernel_size = 3, strides = 2, padding = 'same', use_bias = False, activation: relu |
| AlexNet-SVM18 | input layers: 1; Conv2D(filters = 96, kernel_size = (3, 3), strides = (1, 1), activation = 'relu'); BatchNormalization(); MaxPool2D(pool_size = (3, 3), strides = (2, 2)) |
| ATT-DenseNet16 | convolution 1; max-pooling 1; dense block 1; transition layer 1; SE block; dense block 2; transition layer 2 |
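To make the LeNet row of Table 2 concrete, the sketch below traces the feature-map shapes produced by its layer sequence (Conv2D(16, 5×5) → MaxPool 2×2 → Conv2D(32, 5×5) → MaxPool 2×2 → Flatten → Dense(120) → Dense(84)) with Keras-default 'valid' convolutions. The 32×32×1 input is an assumption for illustration; Table 2 does not state the input shape.

```python
def conv_out(size, kernel, stride=1, padding=0):
    # Output size of a 'valid' convolution (Keras default padding='valid').
    return (size + 2 * padding - kernel) // stride + 1

def lenet_shapes(h=32, w=32, c=1):
    # Trace feature-map shapes through the LeNet configuration in Table 2.
    # NOTE: the 32x32x1 input is an assumed example, not given in the table.
    shapes = [("input", (h, w, c))]
    h, w = conv_out(h, 5), conv_out(w, 5)   # Conv2D(16, (5, 5))
    shapes.append(("conv1", (h, w, 16)))
    h, w = h // 2, w // 2                   # MaxPooling2D((2, 2))
    shapes.append(("pool1", (h, w, 16)))
    h, w = conv_out(h, 5), conv_out(w, 5)   # Conv2D(32, (5, 5))
    shapes.append(("conv2", (h, w, 32)))
    h, w = h // 2, w // 2                   # MaxPooling2D((2, 2))
    shapes.append(("pool2", (h, w, 32)))
    shapes.append(("flatten", (h * w * 32,)))  # Flatten
    shapes.append(("dense1", (120,)))          # Dense(120, relu)
    shapes.append(("dense2", (84,)))           # Dense(84, relu)
    return shapes

for name, shape in lenet_shapes():
    print(name, shape)
```

Under this assumed input, the flattened vector feeding the 120-unit dense layer has 5 × 5 × 32 = 800 features, which is useful for sizing the classifier head.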