Mixed-Reality Simulation for Neurosurgical Procedures


Abstract

BACKGROUND:

Surgical education is moving rapidly toward the use of simulation for the technical training of residents and for the maintenance or upgrading of surgical skills in clinical practice. To optimize the learning exercise, it is essential that both visual and haptic cues be presented so as to best reproduce a real-world experience. Many systems attempt to achieve this goal through a fully virtual interface.

OBJECTIVE:

To demonstrate that the most critical aspect of optimizing a simulation experience is providing visual and haptic cues that allow the training to fully mimic the real-world environment.

METHODS:

Our approach has been to create a mixed-reality system consisting of a physical and a virtual component. A physical model of the head or spine is created with a 3-dimensional printer using deidentified patient data. The model is linked to a virtual radiographic system or an image guidance platform. A variety of surgical challenges can be presented in which the trainee must use the same anatomic and radiographic references required during actual surgical procedures.

RESULTS:

Using these techniques, we have created simulators for ventriculostomy, percutaneous stereotactic lesioning for trigeminal neuralgia, and spinal instrumentation. The design and implementation of these platforms are presented.

CONCLUSION:

These systems have given residents an opportunity to understand and appreciate the complex 3-dimensional anatomy of the 3 simulated neurosurgical procedures. They have also made it possible to break procedures down into critical segments, allowing the user to concentrate on specific areas of deficiency.

ABBREVIATIONS:

IGW, image-guided workstation; RFL, radiofrequency lesion
