Sean Lenox BA’15, research engineer in the modeling and simulation lab at UT Dallas, uses the Microsoft HoloLens to test the functionality of virtual character “Walter” by asking questions and listening to his responses.

With new federal funding, a University of Texas at Dallas research team will study how augmented reality can help medical students prepare for interactions with real patients.

The research, funded by a $750,000 grant from the National Science Foundation (NSF), will examine learning interactions with a range of virtual entities, including virtual peers, virtual professors and virtual patients, with the goal of determining the best methods for social learning.

“Augmented reality means you’re taking reality and adding something virtual to it,” said Marjorie Zielke PhD’07, research professor and director of the Center for Modeling and Simulation/Virtual Humans and Synthetic Societies Lab. “In this research, we are exploring what role real peers, virtual peers, real professors and virtual professors play in learning about patient communication.”

As part of its work, the team is developing an emotive virtual reality patient, which will allow medical students to improve their patient communication skills. One device the team is using is the Microsoft HoloLens, a head-mounted display resembling goggles that creates immersive, augmented reality holograms.

Designed to complement other training methods, the virtual humans will possess a lifelike ability to have a conversation and convey emotion, which Zielke said is particularly important in the patient-physician relationship process. Zielke said the virtual patients also can exhibit symptoms to help medical students improve their nonverbal communication skills.

“A medical professional has to be able to read between the lines. This is where emotional intelligence, communication skills and the ability to discern nonverbal cues come in,” she said. “Students can practice more often with virtual entities than with actors or mannequins, and they are able to get feedback in a more flexible way. Our research is as much about learning science as it is about technology development. We are trying to explore the future of learning with this work.”

Zielke is collaborating on the project with UT Southwestern Medical Center researchers Dr. Robert Rege, associate dean for undergraduate medical education; and Dr. James Wagner, associate dean for LCME accreditation and educational outcomes; as well as Dr. Scotty Craig from Arizona State University. It extends research that was previously funded by Southwestern Medical Foundation and the NSF’s US Ignite high-speed network research program.

As part of its work, the research team is developing a natural-language interface capable of responsive and realistic communication. The group also is compiling data on body language, facial cues and other physiological information. An important part of the research is to discern what value high-speed computer networks add to the creation and deployment of virtual humans.

“We’re hoping to transmit our virtual humans at high speeds over a network that has the ability to transfer a lot of data in a short amount of time without delays,” Zielke said. “We want to take the virtual humans and integrate them in a way so that students can prepare for their exams through practice sessions. Transmitting our virtual humans over advanced networks is a key part of those efforts.

“We don’t want the virtual humans to be linear, where there is just a choice of canned responses. We want it to be as if you’re interacting with a real person on all levels. That is our goal.”

“Walter” joins members of the Center for Modeling and Simulation/Virtual Humans and Synthetic Societies Lab, which is led by Dr. Marjorie Zielke, research professor (seated, second from right).