Table 6 Summary of strengths and weaknesses of employed explanatory techniques.

From: Soft-computing models for predicting plastic viscosity and interface yield stress of fresh concrete

Technique: Shapley Additive Explanations (SHAP)

Benefits:

• Provides a unified measure of input-feature importance across models

• Gives interpretations at both the local and the global level

• Easy to implement

• Expresses feature importance as additive feature attributions

• Results can be readily presented with a variety of plots

Limitations:

• Different plot types convey different information, which can cause confusion

Reference: 117
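A minimal sketch of how SHAP attributions of the kind summarised above can be produced in Python with the shap package. The model, feature names, and synthetic data below are illustrative assumptions, not the models or mix-design variables used in the study.

```python
# Illustrative SHAP analysis for a tree-based regressor.
# Feature names (w_c_ratio, superplasticizer, ...) and the synthetic data
# are assumptions for demonstration only, not the study's dataset.
import numpy as np
import pandas as pd
import shap
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = pd.DataFrame(
    rng.random((200, 4)),
    columns=["w_c_ratio", "superplasticizer", "fly_ash", "aggregate"],
)
# Synthetic target standing in for, e.g., plastic viscosity
y = 2.0 * X["w_c_ratio"] + X["superplasticizer"] + rng.normal(0, 0.1, 200)

model = XGBRegressor(n_estimators=100, max_depth=3).fit(X, y)

explainer = shap.TreeExplainer(model)    # exact additive attributions for tree models
shap_values = explainer.shap_values(X)   # one attribution per feature per sample

# Global view: mean-|SHAP| feature importance and per-sample effects
shap.summary_plot(shap_values, X)

# Local view: additive breakdown of a single prediction
shap.force_plot(explainer.expected_value, shap_values[0], X.iloc[0], matplotlib=True)
```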

Technique: Individual Conditional Expectation (ICE)

Benefits:

• Useful tool for checking the global interpretability of models

• Depicts a detailed view of how variation in a feature affects the predicted output

• Shows the relationship between each input and the outcome as a curve

• Useful for capturing heterogeneity in feature effects

• Easy to interpret and implement

Limitations:

• Can be visually challenging to interpret for complex feature relations

• Provides only visual analysis

Reference: 118
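A minimal sketch of ICE curves using scikit-learn's PartialDependenceDisplay. Again, the model, feature names, and synthetic data are illustrative assumptions; kind="both" overlays the averaged partial-dependence curve on the individual curves so heterogeneity in feature effects becomes visible.

```python
# Illustrative ICE plot with scikit-learn.
# Feature names and the synthetic data are assumptions for demonstration only.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import PartialDependenceDisplay

rng = np.random.default_rng(0)
X = pd.DataFrame(
    rng.random((200, 3)),
    columns=["w_c_ratio", "superplasticizer", "fly_ash"],
)
# Heterogeneous effect: the influence of w/c ratio depends on superplasticizer
# dosage, which ICE curves can reveal while an averaged PDP would mask it.
y = X["w_c_ratio"] * (1.0 + 2.0 * (X["superplasticizer"] > 0.5)) + rng.normal(0, 0.05, 200)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# kind="both" draws one curve per sample (ICE) plus their average (PDP)
PartialDependenceDisplay.from_estimator(model, X, features=["w_c_ratio"], kind="both")
plt.show()
```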