Table 4 The degree of correlation between the variable nodes in the Bayes structure.

From: Bayesian neural network modelling for estimating ecological footprints and blue economy sustainability across G20 nations

| X | Y | Mutual Information | Symmetric | For X Given Y | For Y Given X | Entropy X | Entropy Y |
|---|---|---|---|---|---|---|---|
| (POT) | (GHG) | 0.6192 | 0.4351 | 0.4067 | 0.4677 | 1.5224 | 1.3238 |
| (GDP) | (BE) | 0.2833 | 0.1238 | 0.1392 | 0.1115 | 2.0352 | 2.5402 |
| (POT) | (EF) | 0.2197 | 0.1849 | 0.1443 | 0.2575 | 1.5224 | 0.8531 |
| (GHG) | (GDP) | 0.1322 | 0.0787 | 0.0998 | 0.0649 | 1.3238 | 2.0352 |
| (GHG) | (BE) | 0.0918 | 0.0475 | 0.0694 | 0.0362 | 1.3238 | 2.5402 |
| (POT) | (GDP) | 0.0841 | 0.0473 | 0.0552 | 0.0413 | 1.5224 | 2.0352 |
| (POT) | (BE) | 0.0794 | 0.0391 | 0.0522 | 0.0313 | 1.5224 | 2.5402 |
| (EF) | (GDP) | 0.0211 | –0.0146 | –0.0247 | –0.0104 | –0.8531 | –2.0352 |
| (GHG) | (EF) | 0.0196 | 0.0180 | 0.0148 | 0.0230 | 1.3238 | 0.8531 |
| (EF) | (BE) | 0.0005 | –0.0003 | –0.0006 | –0.0002 | –0.8531 | –2.5402 |

  1. The mutual information and the symmetric score characterize the joint distribution of the variables X and Y; a higher score indicates a stronger relationship between them. Consistent with the tabulated values, the symmetric score is the mutual information normalized by the mean of the two entropies, 2I(X;Y)/(H(X)+H(Y)), while the "For X Given Y" and "For Y Given X" columns are the uncertainty coefficients I(X;Y)/H(X) and I(X;Y)/H(Y), i.e. the conditional scores of one variable given the other. The entropy columns report the information content of each variable in the Bayes network; the deterministic processing performed by the network reduces the entropy score.
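
The normalizations described above can be sketched in Python. This is an illustrative reimplementation, not code from the paper: the helper names (`symmetric_score`, `x_given_y`, `y_given_x`) and the sample data are mine; only the formulas mirror the table's columns.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy H(V) in bits, estimated from a discrete sample."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def symmetric_score(xs, ys):
    """Mutual information normalized by the mean entropy: 2I / (H(X) + H(Y))."""
    return 2 * mutual_information(xs, ys) / (entropy(xs) + entropy(ys))

def x_given_y(xs, ys):
    """Uncertainty coefficient for X given Y: I(X;Y) / H(X)."""
    return mutual_information(xs, ys) / entropy(xs)

def y_given_x(xs, ys):
    """Uncertainty coefficient for Y given X: I(X;Y) / H(Y)."""
    return mutual_information(xs, ys) / entropy(ys)
```

For instance, the (POT)–(GHG) row of the table follows this pattern: 2 × 0.6192 / (1.5224 + 1.3238) ≈ 0.4351 (Symmetric), and 0.6192 / 1.5224 ≈ 0.4067 (For X Given Y).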