Table 3 ElemNet Architecture.

From: ElemNet: Deep Learning the Chemistry of Materials From Only Elemental Composition

| Layer Types | No. of units | Activation | Layer Positions |
|---|---|---|---|
| Fully-connected Layer | 1024 | ReLU | 1st to 4th |
| Drop-out (0.8) | 1024 | — | After 4th |
| Fully-connected Layer | 512 | ReLU | 5th to 7th |
| Drop-out (0.9) | 512 | — | After 7th |
| Fully-connected Layer | 256 | ReLU | 8th to 10th |
| Drop-out (0.7) | 256 | — | After 10th |
| Fully-connected Layer | 128 | ReLU | 11th to 13th |
| Drop-out (0.8) | 128 | — | After 13th |
| Fully-connected Layer | 64 | ReLU | 14th to 15th |
| Fully-connected Layer | 32 | ReLU | 16th |
| Fully-connected Layer | 1 | Linear | 17th |

  1. Considering the input as the 0th layer, the types and positions of the fully-connected and dropout layers are shown above. Dropout layers are used to prevent overfitting and are not counted as separate layers. ReLU is used as the activation function for all hidden layers; the output layer is linear.
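The 17-layer stack in the table can be sketched as a plain feed-forward pass. This is an illustrative reconstruction, not the authors' implementation: the 86-dimensional input (one entry per element in the composition vector) is an assumption, the weights here are random, and dropout is omitted since it is inactive at inference time. Note also that whether a value such as 0.8 denotes a keep probability or a drop probability depends on the framework convention.

```python
import numpy as np

# Hidden-layer widths taken from Table 3; INPUT_DIM = 86 is an assumed
# elemental-composition vector length, not stated in the table itself.
INPUT_DIM = 86
WIDTHS = [1024] * 4 + [512] * 3 + [256] * 3 + [128] * 3 + [64] * 2 + [32] + [1]

def init_params(rng):
    """Random weights and zero biases for the 17 fully-connected layers."""
    dims = [INPUT_DIM] + WIDTHS
    return [(rng.standard_normal((m, n)) * 0.01, np.zeros(n))
            for m, n in zip(dims[:-1], dims[1:])]

def forward(params, x):
    """ReLU on every layer except the last, which is linear (the output).
    Dropout layers are skipped: at inference they are the identity."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)
    return x

rng = np.random.default_rng(0)
params = init_params(rng)
y = forward(params, np.ones(INPUT_DIM))
print(len(params), y.shape)  # 17 fully-connected layers, one scalar output
```

Running the sketch confirms the table's bookkeeping: 17 weight matrices (layers 1 through 17, with the input counted as the 0th layer) mapping an 86-dimensional composition vector to a single regression target.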