‘Projection reality’ addresses gap in education
Written by Emil Venere
A new system that projects changing 3D facial features and expressions onto mannequins used in simulations could provide a new dimension of teaching for nursing students.
Conventional simulations using mannequins for nursing education are unable to re-create the dynamic, sometimes subtle facial expressions and nonverbal cues that can reveal whether a patient has suffered a stroke. For example, the mannequins’ eyes blink, but their mouths don’t move, and they are incapable of depicting the facial drooping often seen after a stroke.
“In a simulation we’re trying to create a real-life situation,” says Amy Nagle, a clinical assistant professor of nursing who has expertise in the use of simulations. “Right now, the mannequins that we are using, their faces are very static. They can’t have any expressions. Their mouths don’t move. It’s not 100% realistic.”
She is working to change that, in research funded with a two-year $100,000 grant through the Instructional Innovation Program, sponsored by the Office of the Provost and the Vice President for Information Technology. The project, a collaboration between the Purdue Envision Center, the College of Engineering and the School of Nursing, aims to design a new system that will use 3D “virtual projection to enhance nursing intuition.”
Leading the research with Nagle are George Takahashi, technical lead in the Envision Center; Denny Yu, assistant professor of industrial engineering; and Bradley Duerstock, associate professor of engineering practice. The project team also includes School of Nursing faculty members Tera Hornbeck, Beth Smith and Ann Loomis, who have expertise in simulation.
"The integration of this tool offers opportunities to practice rare and critical events that nursing students may never experience throughout their undergraduate education."
Amy Nagle
Clinical Assistant Professor of Nursing
The technology applies principles of “augmented reality,” in which real objects are enhanced with layers of virtual features, as opposed to virtual reality, in which the entire environment is computer-generated. “The term we are using is 3D projection reality,” Nagle says.
The system appears to be the only one of its kind for nursing education, although similar approaches are being used in other kinds of medical applications.
The School of Nursing currently operates a simulation lab, where a faculty member sits behind a one-way mirror, controlling simulation mannequins as students encounter various scenarios.
“We can do some things with the mannequin. We can make it cough. We can give them heart sounds, and lung sounds and things like that,” Nagle says. “However, we cannot do anything with the face. Our hope is that eventually we would be able to make the face move, show emotion or respond to what the students are saying. So, it wouldn't just be a mannequin looking at you with no expression.”
Before the virtual face is projected, a mask is generated with a 3D printer and placed over the mannequin’s face, making it easier to project the virtual imagery onto the simulation mannequin.
“We want to artificially create a full face and manipulate what they are doing and saying based on certain parameters,” says Takahashi, who likens the system to a sort of flight-training simulator for nursing.
The researchers will modify commercially available 3D projection technology, adding a tracking system to correctly superimpose the virtual face.
“The implementation is where the difficulty lies,” Takahashi says. “The hardware already exists. You can buy a lot of this stuff off the shelf. The software is the magic.”
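To illustrate what that alignment software might involve, the sketch below is a rough, hypothetical example rather than the team’s implementation: it uses OpenCV to warp a rendered face image so it lands on a tracked target in the projector’s output. It simplifies the 3D-printed mask to a flat surface and maps it with a planar homography; the projector resolution, file names and coordinates are invented for the example.

```python
# Minimal projection-mapping sketch: warp a rendered face so it covers a
# tracked target in the projector image. All numbers are illustrative.
import cv2
import numpy as np

PROJECTOR_RES = (1920, 1080)  # assumed projector resolution (width, height)

def warp_face_to_target(face_img, face_corners, tracked_corners):
    """Map the rendered face onto the tracked mask location.

    face_corners:    four (x, y) corners of the face in the rendered image
    tracked_corners: the same four points reported by the tracking system,
                     in projector pixel coordinates
    """
    H, _ = cv2.findHomography(np.float32(face_corners),
                              np.float32(tracked_corners))
    return cv2.warpPerspective(face_img, H, PROJECTOR_RES)

# Hypothetical usage with made-up tracking data:
face = cv2.imread("rendered_face.png")                     # hypothetical asset
src = [(0, 0), (639, 0), (639, 479), (0, 479)]             # render corners
dst = [(800, 300), (1150, 310), (1140, 700), (790, 690)]   # tracked mask corners
projector_frame = warp_face_to_target(face, src, dst)
```

A real system projecting onto a curved mask would need a full 3D calibration among tracker, projector and mask geometry rather than a single planar mapping; the sketch only shows where tracking data enters the pipeline.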
The projector system is relatively portable, possibly compact enough to mount on a bed in the simulation lab.
“The integration of this tool offers opportunities to practice rare and critical events that nursing students may never experience throughout their undergraduate education,” Nagle says.
Stroke, for example, is characterized by symptoms including pupil dilation, facial drooping, slurred speech, extremity weakness and facial grimacing.
“These are symptoms that we are unable to represent in our current simulation state, and the chance that students would see a patient have an acute stroke in one of their clinical rotations is improbable,” she says. “We are trying to address this educational gap.”
The simulations are expected to be gradually introduced over the next two years and will be used for undergraduates in several senior-level nursing courses. The system may undergo a trial run in the Transitions to Practice course.
“The grant is to trial it out in the lab with the hopes that we can develop the technology, and then the technology could be used in the classroom and other places,” Nagle says.
Yu and Duerstock will focus on objectively measuring and interpreting the students’ physiological responses to the simulations, helping the team assess how well the teaching tool is working.
“This is very new,” Yu says. “We want to determine objectively what kind of behavior changes are occurring while a student is using the simulation.”
Students will wear glasses containing sensors that track eye movements, gaze and pupil-diameter changes.
“We want to know what they are looking at, how long they are looking at it, their patterns of eye movements,” Yu says.
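As a hypothetical illustration of how such gaze data might be summarized, and not the study’s analysis code, the sketch below turns a stream of gaze samples into dwell time per area of interest and a mean pupil diameter; the sampling rate, areas of interest and sample format are all assumptions.

```python
# Illustrative eye-tracking summary: dwell time per area of interest (AOI)
# and mean pupil diameter. AOIs and sample format are assumed for the example.
from dataclasses import dataclass

SAMPLE_RATE_HZ = 60  # assumed tracker sampling rate

# Hypothetical AOIs in scene pixels: (x_min, y_min, x_max, y_max)
AOIS = {
    "patient_face": (300, 100, 500, 300),
    "vitals_monitor": (600, 50, 900, 400),
}

@dataclass
class GazeSample:
    x: float         # gaze position in scene pixels
    y: float
    pupil_mm: float  # pupil diameter in millimeters

def summarize(samples):
    """Return dwell time in seconds per AOI and the mean pupil diameter."""
    dwell = {name: 0.0 for name in AOIS}
    for s in samples:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= s.x <= x1 and y0 <= s.y <= y1:
                dwell[name] += 1.0 / SAMPLE_RATE_HZ
    mean_pupil = sum(s.pupil_mm for s in samples) / len(samples)
    return dwell, mean_pupil
```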
Other metrics will scrutinize heart-rate variability and skin conductance, a measure of emotional arousal. Skin-conductance sensors, worn on a finger, detect electrical activity across the skin.
“When there is more physiological arousal, more stress, the signal will be stronger,” Yu says.
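Two standard ways such signals are reduced to numbers are the root mean square of successive differences (RMSSD) for heart-rate variability and a mean level for skin conductance. The sketch below shows both under assumed input formats; it is not the project’s code, and the readings are made up.

```python
# Illustrative physiological summaries: RMSSD from R-R intervals and mean
# skin conductance level. Inputs are made-up example readings.
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def mean_scl(conductance_us):
    """Mean skin conductance level in microsiemens over a recording window."""
    return sum(conductance_us) / len(conductance_us)

print(rmssd([812, 790, 805, 830, 798]))  # lower HRV often accompanies stress
print(mean_scl([2.1, 2.3, 2.8, 3.0]))    # higher conductance suggests arousal
```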
The project will involve several students from Engineering, Nursing and the Envision Center.