Table 2 Impact of network depth on MAE

From: DenseGNN: universal and scalable deeper graph neural networks for high-performance property prediction in crystals and molecules

| Models | Log KVRH MAE (log₁₀ GPa) | GCL | % improve | Phonons MAE (cm⁻¹) | GCL | % improve |
|---|---|---|---|---|---|---|
| DenseGNN | 0.0512 | 5 | – | 24.8470 | 5 | – |
| Model1 | 0.0552 | 5 | −7.81 | 26.9417 | 5 | −8.43 |
| Model2 | 0.0632 | 5 | −23.44 | 32.1046 | 5 | −29.21 |
| Model3 | 0.0548 | 5 | −7.03 | 29.2872 | 5 | −17.87 |
| Model4 | 0.0571 | 5 | −11.52 | 30.4843 | 5 | −22.69 |
| DeeperDenseGNN | **0.0490** | 15 | 4.30 | **23.1206** | 15 | 6.95 |

GCL: number of graph convolution (GC) layers.

  1. This table evaluates the impact of network depth on MAE for the DenseGNN, SchNet, HamNet, and GIN models fused with the DCN and LOPE strategies across six Matbench datasets. All models can scale to at least 30 GC layers. The best results are highlighted in bold. The shaded area highlights the MAE results of models with more than 30 GC layers. An asterisk (*) denotes a training failure due to insufficient computational resources.
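
The "% improve" column is consistent with the relative MAE reduction with respect to the DenseGNN baseline at 5 GC layers, i.e. (MAE_baseline − MAE_model) / MAE_baseline × 100. The source does not state this formula explicitly; the sketch below is a minimal Python check of that inference, using only the MAE values reported in the table:

```python
# Recompute the "% improve" column from the reported MAE values.
# Assumption (inferred, not stated in the source): the reference is
# DenseGNN with 5 GC layers, and
#   % improve = (baseline_MAE - model_MAE) / baseline_MAE * 100.

BASELINES = {"log_kvrh": 0.0512, "phonons": 24.8470}  # DenseGNN MAEs

MAES = {
    "Model1":         {"log_kvrh": 0.0552, "phonons": 26.9417},
    "Model2":         {"log_kvrh": 0.0632, "phonons": 32.1046},
    "Model3":         {"log_kvrh": 0.0548, "phonons": 29.2872},
    "Model4":         {"log_kvrh": 0.0571, "phonons": 30.4843},
    "DeeperDenseGNN": {"log_kvrh": 0.0490, "phonons": 23.1206},
}

def pct_improve(baseline: float, mae: float) -> float:
    """Relative MAE reduction vs. the baseline, in percent."""
    return (baseline - mae) / baseline * 100.0

for model, maes in MAES.items():
    row = ", ".join(
        f"{task}: {pct_improve(BASELINES[task], mae):+.2f}%"
        for task, mae in maes.items()
    )
    print(f"{model:15s} {row}")
# Output matches the table, e.g.:
#   Model1          log_kvrh: -7.81%, phonons: -8.43%
#   DeeperDenseGNN  log_kvrh: +4.30%, phonons: +6.95%
```

Under this reading, the positive entries for DeeperDenseGNN (4.30% and 6.95%) indicate that tripling the depth from 5 to 15 GC layers lowers the MAE below the 5-layer DenseGNN baseline on both tasks, while the negative entries for Model1 through Model4 indicate higher (worse) MAE at the same depth.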