Table 3 Numerical evaluation of the suggested movement recognition model compared with various techniques.

From: Development of weighted residual RNN model with hybrid heuristic algorithm for movement recognition framework in ambient assisted living

| Terms | MLSTM33 | BiLSTM36 | DBN43 | WRRNN | HRS-COA-WRRNN |
| --- | --- | --- | --- | --- | --- |
| Linear activation function | | | | | |
| Accuracy | 88.65201 | 89.53683 | 90.59341 | 92.29874 | 94.14164 |
| Sensitivity | 88.93773 | 89.64591 | 90.52503 | 92.21001 | 94.06593 |
| Specificity | 88.63161 | 89.52904 | 90.59829 | 92.30508 | 94.14704 |
| Precision | 35.84822 | 37.94707 | 40.7497 | 46.11908 | 53.44433 |
| FPR | 11.36839 | 10.47096 | 9.401709 | 7.694924 | 5.852957 |
| FNR | 11.06227 | 10.35409 | 9.474969 | 7.789988 | 5.934066 |
| NPV | 88.63161 | 89.52904 | 90.59829 | 92.30508 | 94.14704 |
| FDR | 64.15178 | 62.05293 | 59.2503 | 53.88092 | 46.55567 |
| F1-score | 51.09965 | 53.32268 | 56.20073 | 61.48585 | 68.16191 |
| MCC | 0.520786 | 0.54218 | 0.569702 | 0.620251 | 0.683736 |
| ReLU activation function | | | | | |
| Accuracy | 88.6936 | 89.64983 | 90.46561 | 92.25782 | 94.18278 |
| Sensitivity | 88.67244 | 89.52381 | 90.56277 | 92.03463 | 94.35786 |
| Specificity | 88.69511 | 89.65883 | 90.45867 | 92.27376 | 94.17027 |
| Precision | 35.90837 | 38.20903 | 40.4043 | 45.97088 | 53.62034 |
| FPR | 11.30489 | 10.34117 | 9.541332 | 7.726242 | 5.829726 |
| FNR | 11.32756 | 10.47619 | 9.437229 | 7.965368 | 5.642136 |
| NPV | 88.69511 | 89.65883 | 90.45867 | 92.27376 | 94.17027 |
| FDR | 64.09163 | 61.79097 | 59.5957 | 54.02912 | 46.37966 |
| F1-score | 51.11675 | 53.55894 | 55.87856 | 61.31513 | 68.3817 |
| MCC | 0.520404 | 0.544054 | 0.566894 | 0.61839 | 0.686235 |
| Tanh activation function | | | | | |
| Accuracy | 88.70194 | 89.60494 | 90.49735 | 92.28689 | 94.24456 |
| Sensitivity | 88.4127 | 89.29453 | 90.40564 | 92.16931 | 94.25044 |
| Specificity | 88.7226 | 89.62711 | 90.50391 | 92.29529 | 94.24414 |
| Precision | 35.89689 | 38.07626 | 40.47694 | 46.07653 | 53.90901 |
| FPR | 11.2774 | 10.37289 | 9.496095 | 7.704712 | 5.755858 |
| FNR | 11.5873 | 10.70547 | 9.594356 | 7.830688 | 5.749559 |
| NPV | 88.7226 | 89.62711 | 90.50391 | 92.29529 | 94.24414 |
| FDR | 64.10311 | 61.92374 | 59.52306 | 53.92347 | 46.09099 |
| F1-score | 51.06188 | 53.38746 | 55.91797 | 61.43898 | 68.58756 |
| Sigmoid activation function | | | | | |
| Accuracy | 88.6576 | 89.51776 | 90.56085 | 92.42479 | 94.14059 |
| Sensitivity | 89.04762 | 89.59184 | 90.63492 | 92.4263 | 94.30839 |
| Specificity | 88.62974 | 89.51247 | 90.55556 | 92.42468 | 94.1286 |
| Precision | 35.87284 | 37.89565 | 40.66952 | 46.56689 | 53.43011 |
| FPR | 11.37026 | 10.48753 | 9.444444 | 7.575316 | 5.871396 |
| FNR | 10.95238 | 10.40816 | 9.365079 | 7.573696 | 5.69161 |
| NPV | 88.62974 | 89.51247 | 90.55556 | 92.42468 | 94.1286 |
| FDR | 64.12716 | 62.10435 | 59.33048 | 53.43311 | 46.56989 |
| F1-score | 51.1428 | 53.26233 | 56.14553 | 61.93117 | 68.21388 |
| MCC | 0.521397 | 0.54153 | 0.569424 | 0.62465 | 0.684629 |
| Softmax activation function | | | | | |
| Accuracy | 88.68571 | 89.67831 | 90.53968 | 92.35344 | 94.21799 |
| Sensitivity | 88.79365 | 89.61905 | 90.28571 | 92.4127 | 93.65079 |
| Specificity | 88.678 | 89.68254 | 90.55782 | 92.34921 | 94.2585 |
| Precision | 35.90501 | 38.28835 | 40.58219 | 46.31663 | 53.81248 |
| FPR | 11.322 | 10.31746 | 9.442177 | 7.650794 | 5.741497 |
| FNR | 11.20635 | 10.38095 | 9.714286 | 7.587302 | 6.349206 |
| NPV | 88.678 | 89.68254 | 90.55782 | 92.34921 | 94.2585 |
| FDR | 64.09499 | 61.71165 | 59.41781 | 53.68337 | 46.18752 |
| F1-score | 51.13346 | 53.6539 | 55.99527 | 61.70641 | 68.35032 |
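For reference, the terms in the table appear to follow the standard confusion-matrix definitions (values other than MCC reported as percentages). The snippet below is a minimal sketch of those conventional definitions, not the paper's own evaluation code; the counts passed in at the end are hypothetical and serve only to illustrate the calculation.

```python
import math

def classification_metrics(tp, tn, fp, fn):
    """Standard confusion-matrix metrics; all values in % except MCC."""
    accuracy    = (tp + tn) / (tp + tn + fp + fn) * 100
    sensitivity = tp / (tp + fn) * 100  # recall / true positive rate
    specificity = tn / (tn + fp) * 100  # true negative rate
    precision   = tp / (tp + fp) * 100  # positive predictive value
    fpr         = fp / (fp + tn) * 100  # false positive rate
    fnr         = fn / (fn + tp) * 100  # false negative rate
    npv         = tn / (tn + fn) * 100  # negative predictive value
    fdr         = fp / (fp + tp) * 100  # false discovery rate
    f1          = 2 * precision * sensitivity / (precision + sensitivity)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    )
    return {
        "Accuracy": accuracy, "Sensitivity": sensitivity,
        "Specificity": specificity, "Precision": precision,
        "FPR": fpr, "FNR": fnr, "NPV": npv, "FDR": fdr,
        "F1-score": f1, "MCC": mcc,
    }

# Hypothetical counts for illustration only; not data from the paper.
print(classification_metrics(tp=850, tn=8300, fp=750, fn=100))
```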