Analysis of the Accuracy and Performance of a Continuous Glucose Monitoring Sensor Prototype: An In-Silico Study Using the UVA/PADOVA Type 1 Diabetes Simulator


Abstract

Background:

Computer simulation has proven over the past decade to be a powerful tool for studying the impact of medical device characteristics on clinical outcomes. Specifically, in type 1 diabetes (T1D), computer simulation platforms have all but replaced preclinical studies and are commonly used to study the impact of measurement errors on glycemia.

Method:

We use complex mathematical models, built from previously acquired data, to represent the characteristics of 3 continuous glucose monitoring (CGM) systems. Leveraging these models within the framework of the UVa/Padova T1D simulator, we study the impact of CGM errors in 6 simulation scenarios designed to generate a wide variety of glycemic conditions. We assess the simulated accuracy of each CGM system using mean absolute relative deviation (MARD) and precision absolute relative deviation (PARD). We also quantify the capacity of each system to detect hypoglycemic events.
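The two accuracy metrics can be sketched as follows. This is a minimal illustration using the standard textbook definitions of MARD (sensor vs. reference) and PARD (sensor vs. an identical paired sensor, with the pairwise mean as denominator); the exact conventions used in the study are not specified in the abstract, so the function names and formulas here are assumptions.

```python
import numpy as np

def mard(cgm, reference):
    """Mean absolute relative deviation (%) of sensor readings
    against time-matched reference glucose values.
    Standard definition: mean(|cgm - ref| / ref) * 100."""
    cgm = np.asarray(cgm, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * np.mean(np.abs(cgm - reference) / reference)

def pard(cgm_a, cgm_b):
    """Precision absolute relative deviation (%) between two identical
    sensors worn simultaneously. Uses the pairwise mean as denominator,
    a common convention (assumption, not confirmed by the abstract)."""
    a = np.asarray(cgm_a, dtype=float)
    b = np.asarray(cgm_b, dtype=float)
    return 100.0 * np.mean(np.abs(a - b) / ((a + b) / 2.0))
```

Lower values indicate better accuracy (MARD) and better sensor-to-sensor consistency (PARD), which is how the percentages in the Results section should be read.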

Results:

The simulated Roche CGM sensor prototype (RCGM) outperformed the 2 alternate systems (CGM-1 and CGM-2) in accuracy (MARD = 8% vs 11.4% vs 18%) and precision (PARD = 6.4% vs 9.4% vs 14.1%). These results held across all studied glucose ranges and rates of change. Moreover, the RCGM detected more than 90% of hypoglycemic events, with a mean time lag of less than 4 minutes (CGM-1: 86%/15 min; CGM-2: 57%/24 min).

Conclusion:

The RCGM system model performed strongly in these simulation studies, with higher accuracy and precision than the alternate systems. These characteristics make it a strong candidate for CGM-based therapy, a finding that should be confirmed in large clinical studies.
