Table 10. Inter-rater reliability metrics for the expert evaluation.
| Reliability metric | Value | 95% CI | Interpretation |
|---|---|---|---|
| Intraclass correlation coefficient (ICC) | 0.87 | [0.82, 0.91] | Excellent reliability |
| Kendall's W (coefficient of concordance) | 0.81 | [0.76, 0.86] | Substantial agreement |
| Fleiss' kappa (multi-rater) | 0.78 | [0.73, 0.83] | Substantial agreement |
| Cronbach's alpha (internal consistency) | 0.89 | [0.85, 0.92] | Good internal consistency |
| Mean absolute deviation from group mean | 0.62 | [0.55, 0.69] | Low individual variation |
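As a reference point for how a multi-rater agreement statistic such as the Fleiss' kappa row is computed, the sketch below implements the standard formula. The `ratings` matrix is purely illustrative (it is not the study's data), and this is not the authors' evaluation code:

```python
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' kappa for an N-subjects x k-categories matrix of rating counts.

    counts[i, j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]  # raters per subject (assumed constant)
    # Per-subject observed agreement: P_i = (sum_j n_ij^2 - n) / (n * (n - 1))
    p_obs = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))
    p_bar = p_obs.mean()
    # Chance agreement from the marginal category proportions
    p_j = counts.sum(axis=0) / counts.sum()
    p_e = (p_j ** 2).sum()
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical ratings: 4 items, 3 raters, 2 categories
ratings = np.array([[3, 0], [0, 3], [3, 0], [1, 2]])
print(round(fleiss_kappa(ratings), 3))  # → 0.657
```

A value of 0.78, as reported in the table, falls in the 0.61-0.80 band conventionally labeled "substantial agreement."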