Table 1 Computational complexity


| Architecture | Scaling | \({l}_{\max }\) |
| --- | --- | --- |
| SchNet (ref. 9) | \({\mathcal{O}}(n\times \langle {\mathcal{N}}\rangle \times 1\times F)\) | 0 |
| PaiNN (ref. 37) | \({\mathcal{O}}(n\times \langle {\mathcal{N}}\rangle \times 4\times F)\) | 1 |
| SpookyNet (ref. 24) | \({\mathcal{O}}(n\times \langle {\mathcal{N}}\rangle \times {({l}_{\max }+1)}^{2}\times F)\) | 2 |
| NequIP (ref. 36) | \({\mathcal{O}}(n\times \langle {\mathcal{N}}\rangle \times {({l}_{\max }+1)}^{6}\times F)\) | 3 |
| SO3krates | \({\mathcal{O}}(n\times \langle {\mathcal{N}}\rangle \times ({({l}_{\max }+1)}^{2}+F))\) | 3 |

  1. Scaling for different message-passing architectures, where n is the number of atoms, \(\langle {\mathcal{N}}\rangle\) the average number of neighbors, \({l}_{\max }\) the maximal degree, and F the feature dimension. SchNet and PaiNN have fixed maximal degrees of \({l}_{\max }=0\) and \({l}_{\max }=1\), respectively, whereas \({l}_{\max }\) is a free parameter in the other models.
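To make the asymptotic expressions in the table concrete, the following minimal sketch (not part of the original article; the function name and the chosen values of \(F\) and \({l}_{\max }\) are assumptions for illustration) evaluates the per-edge cost factor, i.e., the term multiplying \(n\times \langle {\mathcal{N}}\rangle\), for each architecture. It highlights how SO3krates' additive coupling of \({({l}_{\max }+1)}^{2}\) and F compares with the multiplicative coupling in the other models.

```python
# Illustrative sketch of the per-edge cost factors implied by Table 1,
# ignoring constant prefactors. Total cost scales as n * <N> * factor.

def per_edge_cost(architecture: str, l_max: int, F: int) -> int:
    """Return the per-edge scaling factor from Table 1 (up to constants)."""
    if architecture == "SchNet":
        return 1 * F                       # l_max fixed to 0 (scalar features only)
    if architecture == "PaiNN":
        return 4 * F                       # l_max fixed to 1 (1 scalar + 3 vector channels)
    if architecture == "SpookyNet":
        return (l_max + 1) ** 2 * F        # features carried per spherical harmonic component
    if architecture == "NequIP":
        return (l_max + 1) ** 6 * F        # full tensor products couple degrees
    if architecture == "SO3krates":
        return (l_max + 1) ** 2 + F        # additive, not multiplicative, in F
    raise ValueError(f"unknown architecture: {architecture}")


if __name__ == "__main__":
    F = 128       # example feature dimension (assumption)
    l_max = 3     # maximal degree used for SpookyNet, NequIP, SO3krates in the table
    for arch in ("SchNet", "PaiNN", "SpookyNet", "NequIP", "SO3krates"):
        print(f"{arch:10s} per-edge factor: {per_edge_cost(arch, l_max, F):>8d}")
```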