Interobserver and intraobserver reliability study of improved method to evaluate radiographs of patients with scoliosis.

Objective.
To determine the reliability of a computer-assisted measurement protocol for evaluating the Cobb angle and the King et al classification.

Summary of Background Data.
Evaluation of scoliosis radiographs is inherently unreliable because of technical errors and errors of human judgment. Objective, computer-assisted evaluation tools may improve reliability.

Methods.
Posteroanterior preoperative radiographic images of 27 patients with adolescent idiopathic scoliosis were each displayed on a computer screen. Each image was marked 3 times, in random sequence, by each of 5 evaluators (observers), who placed 70 standardized points on the vertebrae and sacrum in each radiograph. The coordinates of these points were analyzed automatically by a computer program (Spine 2002;27:2801–5) that identified curves, calculated Cobb angles, and generated the King et al classification. The interobserver and intraobserver variability of the Cobb angle and King et al classification evaluations was quantified and compared with values obtained by unassisted observers.

Results.
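The geometric core of such a program is the Cobb angle: the angle between the superior endplate of the upper end vertebra and the inferior endplate of the lower end vertebra, each endplate defined by two digitized corner points. The sketch below illustrates that calculation only; the point layout and function names are assumptions for illustration, not the published program's interface.

```python
import math

def endplate_angle(p_left, p_right):
    """Inclination (degrees) of the line through two digitized endplate corners,
    each given as an (x, y) coordinate pair."""
    dx = p_right[0] - p_left[0]
    dy = p_right[1] - p_left[1]
    return math.degrees(math.atan2(dy, dx))

def cobb_angle(upper_left, upper_right, lower_left, lower_right):
    """Cobb angle: absolute difference in inclination between the superior
    endplate of the upper end vertebra and the inferior endplate of the
    lower end vertebra, folded into the 0-180 degree range."""
    a = endplate_angle(upper_left, upper_right)
    b = endplate_angle(lower_left, lower_right)
    angle = abs(a - b)
    return 360.0 - angle if angle > 180.0 else angle
```

For example, an upper endplate tilted +10° against a lower endplate tilted −25° yields a 35° Cobb angle. A full implementation would also identify the end vertebrae automatically (the most tilted ones in each curve), which is where the 70 standardized points come in.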
The average intraobserver standard deviation of the Cobb angle was 2.0° for both the thoracic and lumbar curves (range 0.1° to 8.3° across individual curves). The corresponding interobserver standard deviations were 2.5° for thoracic curves and 2.6° for lumbar curves. Among the 5 observers, there was an inverse relationship between repeatability and time spent marking images, and no correlation with image quality or curve magnitude. Kappa values for intraobserver agreement on the King et al classification averaged 0.85.

Conclusions.
The variability of the Cobb measurements compares favorably with previously published series, and the classification was more reliable than that achieved by unassisted observers evaluating the same radiographs. The same principles may be applicable to other radiographic measurement and evaluation procedures.