Fig. 7: Workflow of the t-th graph convolution layer in DefiNet. | npj Computational Materials


From: Modeling crystal defects using defect informed neural networks


The process begins with message distribution, where the global scalar \({{\boldsymbol{x}}}_{{\mathcal{G}}}^{(t)}\) and global vector \({\vec{\boldsymbol{x}}}_{{\mathcal{G}}}^{(t)}\) are distributed to each node scalar \({{\boldsymbol{x}}}_{i}^{(t)}\) and node vector \({\vec{\boldsymbol{x}}}_{i}^{(t)}\). This is followed by defect-aware message passing, which locally collects messages from neighboring nodes \(v_{j}\), weighting them according to interatomic distances and the defect markers \(m_{i}\) and \(m_{j}\). Next, message updating refines each node representation using the information within the node itself, yielding \({{\boldsymbol{x}}}_{i}^{(t+1)}\) and \({\vec{{\boldsymbol{x}}}}_{i}^{(t+1)}\). Coordinate updating then refines the atomic coordinates, producing \({\overrightarrow{{\boldsymbol{r}}}}_{i}^{(t+1)}\). Finally, message aggregation pools the node features to update the global scalar and vector, yielding \({{\boldsymbol{x}}}_{{\mathcal{G}}}^{(t+1)}\) and \({\vec{{\boldsymbol{x}}}}_{{\mathcal{G}}}^{(t+1)}\).
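The five stages above can be sketched as a single layer pass. The snippet below is a minimal NumPy illustration under assumed shapes and a hypothetical distance/defect weighting; it is not the actual DefiNet implementation (the paper's learned transformations are replaced by simple broadcasts, an exponential distance kernel, and a mean pool).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical sizes, not the actual DefiNet dimensions).
n, d = 4, 8                       # n atoms, d feature channels
x = rng.normal(size=(n, d))       # node scalars x_i^(t)
vx = rng.normal(size=(n, 3, d))   # node vectors (one 3D vector per channel)
xg = rng.normal(size=(d,))        # global scalar x_G^(t)
vxg = rng.normal(size=(3, d))     # global vector
r = rng.normal(size=(n, 3))       # atomic coordinates r_i^(t)
m = np.array([0., 1., 0., 0.])    # defect markers m_i (1 = defect site)

# 1) Message distribution: broadcast the global features to every node.
x = x + xg
vx = vx + vxg

# 2) Defect-aware message passing: collect messages from neighbors v_j,
#    weighted by interatomic distance and the defect markers m_i, m_j
#    (the exponential-plus-marker weighting here is a stand-in).
diff = r[:, None, :] - r[None, :, :]                  # r_i - r_j
dist = np.linalg.norm(diff, axis=-1)                  # |r_i - r_j|
w = np.exp(-dist) * (1.0 + m[:, None] + m[None, :])   # hypothetical weights
np.fill_diagonal(w, 0.0)                              # no self-messages
msg = w @ x                                           # gather neighbor scalars

# 3) Message updating: refine each node with its own information.
x = np.tanh(x + msg)

# 4) Coordinate updating: shift r_i along weighted relative positions.
r = r + 0.01 * np.einsum('ij,ijk->ik', w, diff)       # -> r_i^(t+1)

# 5) Message aggregation: pool nodes back into the global features.
xg = x.mean(axis=0)                                   # -> x_G^(t+1)
vxg = vx.mean(axis=0)
```

Stacking several such layers lets global, node, and coordinate information interact repeatedly, which is the role of the t-th layer shown in the figure.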
