Comparison of Iodine Density Measurement Among Dual-Energy Computed Tomography Scanners From 3 Vendors


Abstract

Objectives

The aims of this study were to analyze the effect of dual-energy computed tomography (DECT) scanners and fluid characteristics on iodine quantification and to calculate the measurement variability range induced by those variables.

Methods

We performed an experimental phantom study with 4 mediastinal iodine phantoms. Each phantom contained 6 tubes of different iodine concentrations (0, 1.0, 2.5, 5.0, 10.0, and 20.0 mg/mL) diluted in one of 4 solvents: water, 10% amino acid solution, 20% lipid emulsion, or 18% calcium solution. The mediastinal phantoms were inserted into an anthropomorphic chest phantom and scanned with 3 different DECT scanners from 3 vendors using 2 radiation dose settings. Iodine density (IoD) and computed tomography (CT) attenuation on virtual monoenergetic 70-keV images and virtual nonenhanced images were measured for the iodine phantoms. The effects of DECT scanner, solvent, and radiation dose on the absolute measurement error of IoD and on the CT attenuation profiles were investigated using linear mixed-effects models. The measurement variability range of IoD was also determined.
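As a minimal sketch of the error metrics described above: for each tube, the signed error is the measured IoD minus the known concentration, its absolute value is the outcome modeled with the mixed-effects analysis, and the variability range is the spread of signed errors across scanner/solvent/dose combinations. The readings below are hypothetical illustrations, not the study's data.

```python
# Hypothetical example: deriving the absolute IoD error and the
# measurement variability range for one tube of known concentration.

true_iod = 0.0  # mg/mL, the nonenhanced (0 mg/mL) tube

# Illustrative IoD readings (mg/mL) for this tube across
# scanner/solvent/dose combinations (not the study's measurements).
measured = [-0.6, -0.2, 0.1, 0.4, 0.0, -0.3]

# Signed error for each measurement
errors = [m - true_iod for m in measured]

# Absolute error: the outcome analyzed with linear mixed-effects models
abs_errors = [abs(e) for e in errors]

# Measurement variability range: the spread of signed errors
variability_range = (min(errors), max(errors))
print(variability_range)  # → (-0.6, 0.4)
```

With these illustrative numbers the range reproduces the form of the reported result (−0.6 to 0.4 mg/mL at a true concentration of 0 mg/mL).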

Results

Absolute error of IoD was not significantly affected by DECT system, type of solvent, or radiation dose setting. However, CT attenuation profiles differed significantly among the DECT vendors and simulated body fluids. The measurement variability range of IoD was −0.6 to 0.4 mg/mL at a true iodine concentration of 0 mg/mL.

Conclusions

Dual-energy CT systems and fluid characteristics did not significantly affect IoD measurement accuracy. An IoD cutoff of 0.4 mg/mL could serve to identify a truly enhancing lesion on DECT.
