Abstract
Study design:
Intra- and interrater reliability study for radiological variables of the International Spinal Cord Injury (SCI) Spinal Column Injury Basic Data Set.
Objectives:
To test the reliability of the radiological variables in the International SCI Spinal Column Injury Basic Data Set and to compare it with that of the Arbeitsgemeinschaft für Osteosynthesefragen (AO) classification.
Setting:
The database of the Eastern Denmark Regional SCI Referral Center, Copenhagen, Denmark.
Methods:
Two international observers rated the radiological variables of the International SCI Spinal Column Injury Basic Data Set and the AO classification for all spine trauma patients treated surgically at the Spine Unit, Rigshospitalet, Denmark, between 1 October 2010 and 31 December 2012. Crude agreement and Cohen's unweighted kappa (κ) coefficients were calculated for intra- and interrater reliability.
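For illustration, a minimal sketch of how the two reported statistics (crude agreement and Cohen's unweighted κ) can be computed from paired categorical ratings is shown below; the example ratings and the use of scikit-learn are assumptions for demonstration only, not the authors' original analysis code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of the same eight injuries by two observers
# (e.g. an AO type A/B/C assignment); illustrative values only.
rater_1 = np.array(["A", "A", "B", "C", "B", "A", "C", "B"])
rater_2 = np.array(["A", "B", "B", "C", "B", "A", "C", "C"])

# Crude agreement: proportion of injuries assigned the same category.
crude_agreement = np.mean(rater_1 == rater_2)

# Cohen's unweighted kappa: observed agreement corrected for the
# agreement expected by chance.
kappa = cohen_kappa_score(rater_1, rater_2)

print(f"crude agreement = {crude_agreement:.2f}, kappa = {kappa:.2f}")
```

For intrarater reliability, the same calculation is applied to one observer's repeated ratings of the same injuries.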
Results:
For 283 spine injuries, the intra- and interrater reliability of the individual radiological variables of the International SCI Spinal Column Injury Basic Data Set was at least substantial (κ=0.79–0.89 for intrarater and κ=0.67–0.97 for interrater agreement). For the AO classification, intrarater reliability was moderate to substantial (κ=0.57–0.75), whereas interrater reliability was substantial (κ=0.67–0.69). Crude intra- and interrater agreement for a combined radiographic SCI Spinal Column Injury Basic Data Set variable did not differ significantly from that of the AO classification (P=0.067–0.895).
Conclusions:
The reliability of the radiological variables of the International SCI Spinal Column Injury Basic Data Set is comparable to that of the AO classification system. We encourage its use for describing spinal column injuries, thereby facilitating data collection and comparison between centres and countries.
Ethics declarations
Competing interests
The authors declare no conflict of interest.
About this article
Cite this article
Lucantoni, C., Krishnan, R., Gehrchen, M. et al. Reliability of the radiographic variables in the International Spinal Cord Injury Spinal Column Injury Basic Data Set compared with the AO classification. Spinal Cord 54, 884–888 (2016). https://doi.org/10.1038/sc.2016.6

