Table 2 The parameters used in the FriendPat-based EEG signal classification model.

From: An explainable EEG epilepsy detection model using friend pattern

Phase: Feature extraction
Parameters: The main feature extraction function is FriendPat, which is a distance-based feature extractor. Since a 35-channel EEG dataset was used, the presented FriendPat extracts 595 features from each EEG signal.
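The feature dimensionality is consistent with a pairwise, distance-based extractor: 35 channels yield C(35, 2) = 595 channel pairs. As a minimal sketch under that assumption (this is not the actual FriendPat algorithm, whose internal distance definition is not reproduced in the table), a pairwise channel-distance extractor could look like:

```python
import numpy as np

def pairwise_distance_features(eeg_epoch):
    """Hypothetical pairwise distance extractor.

    eeg_epoch: (n_channels, n_samples) array. For 35 channels this
    yields C(35, 2) = 595 features, matching the dimensionality in
    the table. The distance metric here (Manhattan) is illustrative.
    """
    n_channels = eeg_epoch.shape[0]
    feats = []
    for i in range(n_channels):
        for j in range(i + 1, n_channels):
            # City-block (Manhattan) distance between the channel pair
            feats.append(np.abs(eeg_epoch[i] - eeg_epoch[j]).sum())
    return np.array(feats)
```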

Phase: Feature selection
Parameters: CWINCA is utilized for feature selection. This feature selector is an iterative and self-organizing feature selector. The parameters of the utilized CWINCA are as follows: the threshold points were chosen as 0.5 and 0.9999. Using these values, the start index was detected as 5 and the stop index was computed as 120. Thus, 116 (= 120 − 5 + 1) selected feature vectors were generated, and the optimal length of the selected features was computed as 82.
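One way to read the two thresholds is as cumulative-weight cut points over importance-sorted features: the lower threshold fixes the start index, the upper one the stop index, and every length in between defines one candidate feature vector. The sketch below illustrates that reading; the exact CWINCA rule and its NCA weight computation are assumptions not spelled out in the table.

```python
import numpy as np

def cwinca_indices(weights, t_low=0.5, t_high=0.9999):
    """Hedged sketch: derive start/stop indices from cumulative
    normalized feature weights (e.g., NCA importance scores).

    Returns 1-based start/stop indices and the candidate feature
    index sets of every length between them.
    """
    order = np.argsort(weights)[::-1]                    # most important first
    cumulative = np.cumsum(weights[order]) / weights.sum()
    start = int(np.searchsorted(cumulative, t_low)) + 1  # 1-based
    stop = int(np.searchsorted(cumulative, t_high)) + 1
    # One candidate vector per length start..stop (inclusive);
    # a loss calculator would then pick the optimal length among them.
    candidates = [order[:k] for k in range(start, stop + 1)]
    return start, stop, candidates
```

With the paper's values (start 5, stop 120) this loop would produce the stated 116 candidate vectors, from which the length-82 vector was selected as optimal.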

Phase: Classification
Parameters: In the classification phase, tkNN was utilized. For the used tkNN, the iteratively changed parameters are the k value (from 1 to 5), the distance metric (Manhattan, Cosine, Euclidean), and the weighting scheme (Inverse, Equal). Thus, 30 (= 5 × 3 × 2) parameter-wise outcomes were generated. By deploying iterative majority voting (IMV), 28 voted outcomes were created, since the iteration range of the IMV is from 3 to 30 and the used voting function is the mode function. In this iteration, the parameter-wise outcomes were sorted according to their classification accuracy. In the greedy algorithm, the best outcome (the outcome with maximum classification accuracy) was selected automatically.
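The voting step described above can be sketched as follows: sort the 30 parameter-wise prediction vectors by accuracy, then apply the mode over the top-3, top-4, ..., top-30 predictions, giving 28 voted outcomes. The function names and data layout are illustrative, not taken from the paper's implementation.

```python
from collections import Counter

def iterative_majority_voting(predictions_by_param, accuracies):
    """Sketch of IMV over sorted tkNN outcomes.

    predictions_by_param: list of 30 per-sample prediction lists, one
    per (k, distance, weight) combination (assumed layout).
    accuracies: classification accuracy of each outcome.
    """
    # Sort the parameter-wise outcomes by accuracy, best first
    order = sorted(range(len(accuracies)), key=lambda i: -accuracies[i])
    sorted_preds = [predictions_by_param[i] for i in order]

    voted = []
    # Vote over the top-3 .. top-30 outcomes -> 30 - 3 + 1 = 28 results
    for top in range(3, len(sorted_preds) + 1):
        fused = [Counter(col).most_common(1)[0][0]       # mode function
                 for col in zip(*sorted_preds[:top])]
        voted.append(fused)
    return voted  # the greedy step then keeps the most accurate of these
```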

Phase: XAI
Parameters: The DLob XAI generator is used in this phase. In this research, 13 DLob symbols are used to code the channels of the EEG signal dataset, and the used DLob symbols are FL, FR, Fz, TL, TR, PL, PR, Pz, OL, OR, CL, CR, and Cz. Moreover, the cortical connectome diagram and the information entropy of the created DLob string were computed. Since 13 DLob symbols are used in this research, the maximum information entropy is equal to 3.7004 (= log2 13). To compute the complexity ratio of the generated DLob string, the information entropy of the created DLob sentence is divided by the maximum entropy.
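The complexity ratio is therefore the Shannon entropy of the symbol string normalized by the 13-symbol ceiling of log2(13) ≈ 3.7004 bits. A minimal sketch of that computation (the DLob string is represented here as a plain list of symbols, an assumption about the data format):

```python
import math
from collections import Counter

def complexity_ratio(dlob_symbols, alphabet_size=13):
    """Shannon entropy of a DLob symbol sequence divided by the
    maximum entropy log2(alphabet_size); with 13 symbols the ceiling
    is log2(13) ~= 3.7004 bits, as stated in the table."""
    counts = Counter(dlob_symbols)
    n = len(dlob_symbols)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return entropy / math.log2(alphabet_size)
```

A string that uses all 13 symbols equally often reaches a ratio of 1.0; a string dominated by a few channels scores lower.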