Fig. 2: Comparison of the effect of applying CrystalTransformer-generated universal atomic embeddings (ct-UAE) across different models, and the distribution over the entire dataset.

From: Transformer-generated atomic embeddings to enhance prediction accuracy of crystal properties with machine learning

Fig. 2

Ef is the formation energy, MAE is the mean absolute error, and R2 is the R-squared (coefficient of determination) for predicting each property. "None" means the model is trained from scratch with no front-end model, and "CT" indicates CrystalTransformer, i.e., ct-UAE. a–c Plots of predicted formation energy versus target formation energy for the CGCNN (ref. 9), MEGNET (ref. 10), and ALIGNN (ref. 11) models on the MP dataset. The upper and right margins show the target and prediction data distributions, respectively. The MAE and R2 for None-CGCNN are 0.083 eV/atom and 0.982, respectively, while for CT-CGCNN they are 0.073 eV/atom and 0.986. The MAE and R2 for None-MEGNET are 0.051 eV/atom and 0.994, respectively, while for CT-MEGNET they are 0.049 eV/atom and 0.994. The MAE and R2 for None-ALIGNN are 0.022 eV/atom and 0.997, respectively, while for CT-ALIGNN they are 0.018 eV/atom and 0.997. d The distribution curve of the formation energy across the entire dataset. Source data are provided as a Source Data file.
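For reference, the MAE and R2 values quoted above are the standard regression metrics. The following is a minimal sketch (not code from the paper) of how they can be computed with NumPy for predicted versus target formation energies; the array names and example values are purely illustrative.

```python
# Minimal sketch (assumption: targets and predictions are 1-D arrays in eV/atom).
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Absolute Error: average magnitude of the prediction errors."""
    return float(np.mean(np.abs(y_pred - y_true)))

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - (residual sum of squares / total sum of squares)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

if __name__ == "__main__":
    # Dummy formation energies (hypothetical, not the MP dataset).
    target = np.array([-1.20, -0.45, 0.10, -2.30])
    predicted = np.array([-1.15, -0.50, 0.05, -2.25])
    print(f"MAE = {mae(target, predicted):.3f} eV/atom")
    print(f"R2  = {r_squared(target, predicted):.3f}")
```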
