Multi-Sensory, Interactive Training Unit Lets Combat Medics See, Feel and Treat Lifelike, 3D Wounds
UCF researchers have invented an innovative, multi-sensory, interactive training system that realistically mimics wounds and provides continuous, dynamic feedback to medical trainees as they treat them. Almost like a real-life video game, the Tactile-Visual Wound (TVW) Simulation Unit portrays the look, feel, and even the smell of different types of human wounds (such as a puncture, stab, slice or tear). It also tracks and analyzes a trainee’s treatment responses and provides corrective instructions.
Other systems are either static, physical wound models (made from colored rubber, for example), or they are virtual reality models rendered by computer graphics onto a monitor or a head-worn display. Although some hands-on training uses moulage wounds (mock injuries) on physical bodies, trainees must still rely on their instructors to describe how real-life injuries behave and respond to treatment.
As a solution, the new TVW unit combines hands-on, tactile experiences with simulated, dynamic wounds and casualty responses—drastically increasing the realism and overall quality of medical training. Trainees can practice their skills on interactive manikins and moulage wounds that simulate the way a particular wound looks, feels and smells. They can also receive real-time feedback from the system about their skills and techniques. Additionally, automated instruction enables personnel to receive training anytime, anywhere.
The TVW invention is a multi-sensory wound simulation unit that combines several technologies to provide an immersive experience for trainees. A TVW unit can include augmented reality software and a headset; sensors; actuators and markers integrated into a medical manikin; and a computer processor. An alternative configuration uses interactive moulage components affixed to a real person instead of a manikin.

When activated, the unit’s AR system continuously tracks the TVW, estimates the deformation of the wound over time, and monitors its response to treatment. For example, a trainee might see (via the AR glasses or headset) a projection that shows blood flowing out of the manikin’s wound and vital signs "dropping." When the trainee applies pressure to the wound, sensors detect the action and wirelessly relay the data to the AR system. In response, the AR system renders (via computer graphics) an appropriate dynamic view of the blood loss slowing, and the physiological simulation reflects stabilized vitals. Real-time depth or other models of the trainee’s hands, medical devices, and so on can also affect the simulated visuals that the AR rendering system generates. For example, blood could appear to flow out of the wound and over the hands until appropriate pressure stops the flow.

A TVW matches the shape and size of a particular wound (such as a puncture or tear), and its outer surface, made of a silicone-based material, feels like skin. The device can run on batteries or connect directly to an external power source.
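The pressure-to-vitals feedback loop described above can be illustrated with a minimal sketch. This is not the actual TVW software; all names, thresholds, and the linear pressure model are hypothetical, chosen only to show how a sensor reading could drive the simulated bleed rate and vital signs each update cycle.

```python
# Hypothetical sketch of the treatment feedback loop: a pressure reading
# from the wound sensor reduces the simulated bleed rate, which in turn
# drives the casualty's vital signs. All values and names are illustrative.

from dataclasses import dataclass


@dataclass
class WoundState:
    bleed_rate_ml_s: float   # current simulated blood loss (mL/s)
    blood_volume_ml: float   # remaining blood volume (mL)
    systolic_bp: float       # simplified vital sign (mmHg)


def apply_treatment_step(state: WoundState, pressure_kpa: float,
                         dt_s: float = 1.0) -> WoundState:
    """Advance the simulation one step given the trainee's applied pressure.

    Assumed model: bleed rate falls linearly with pressure until it stops,
    and systolic blood pressure tracks the remaining blood volume.
    """
    # Pressure at or above an assumed 20 kPa threshold fully stops the bleed.
    effectiveness = min(pressure_kpa / 20.0, 1.0)
    bleed = state.bleed_rate_ml_s * (1.0 - effectiveness)

    volume = max(state.blood_volume_ml - bleed * dt_s, 0.0)
    # Crude vitals model: systolic BP scales with fraction of a 5000 mL volume.
    systolic = 120.0 * (volume / 5000.0)
    return WoundState(bleed_rate_ml_s=bleed,
                      blood_volume_ml=volume,
                      systolic_bp=systolic)


# Example: with no pressure the casualty keeps losing blood and vitals drop;
# firm pressure stops the simulated bleeding, so the AR view can show it slowing.
state = WoundState(bleed_rate_ml_s=10.0, blood_volume_ml=4500.0, systolic_bp=108.0)
state = apply_treatment_step(state, pressure_kpa=0.0)    # untreated step
state = apply_treatment_step(state, pressure_kpa=25.0)   # firm pressure stops the bleed
```

In a real unit, the per-step state would feed the AR renderer (blood-flow visuals) and the physiological simulation (displayed vitals) rather than being printed or stored locally.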
Benefits:
- Simulates all aspects of wound behavior (sight, sound, touch and smell)
- Provides continual visual, physical and audio feedback regarding treatment, skills and techniques
- Enables administrators to change simulation conditions dynamically in real time
Applications:
- Trauma/critical care training for medical professionals (EMT rescue squads, paramedics, combat medics)