The Emerging Role of Digital Technologies in Early Clinical Development

Excerpt

The pharmaceutical industry has seen advances in the use of digital technologies in drug development. Some have used digital technologies to detect changes in biomarkers that might otherwise go undetected with conventional approaches, e.g., the use of gaming for cognitive measurement.1 This cognitive assessment tool was apparently designed to operate as a fast-paced action video game on a tablet. Others have used wearable sensors to improve patient engagement in clinical trials, such as the use of iPhone sensors for data capture in rheumatoid arthritis.2 A more proximal use of these technologies in early clinical research is the deployment of sensors in a study of Parkinson's disease (PD).3 The investigators hope these sensors can serve as an alternative to conventional tools such as the Unified Parkinson's Disease Rating Scale (UPDRS); with an app, fluctuations in PD symptoms could be measured continuously. Another use is the assessment of medication adherence with ingestible sensors.4 Several medical-grade apps in development require regulatory agency clearance for marketing. The US Food and Drug Administration (FDA) categorizes an app intended for use in the diagnosis of disease, or in the cure, mitigation, treatment, or prevention of disease, as a medical device; these are considered higher risk. A second pathway, codified for "minimal risk apps," covers apps that equip users with tools to organize and track health information. These emerging applications of digital technologies prompted the need to examine the current state of the art in delivery (microelectronics) and measurement (wearable sensors) and their increasing use in early clinical development. As with any disruptive innovation, several factors must be considered for their effective adoption and socialization.