Lab Helps Create Virtual Impaired Driver for Identifying DUIs

Editors’ Note: This feature appears as it was published in the summer 2019 edition of UT Dallas Magazine. Titles or faculty members listed may have changed since that time.
Photo: An officer uses the virtual DUI simulation.
A UT Dallas professor has teamed up with researchers at Sam Houston State University and EyeT Plus to create a virtual impaired driver—nicknamed “Brian”—that is helping to train police officers to identify one of the strongest clues to driving under the influence: twitchy eyes.
Brian is the star performer of a project called the Individual Nystagmus Simulated Training Experience (INSITE), which helps officers identify horizontal gaze nystagmus (HGN). This condition, evaluated in a standard field sobriety test, is the involuntary jerking or twitching of a driver's eyes as they follow an officer's finger moving from side to side in front of the driver's face. How soon the eyes begin to twitch during the HGN test indicates the driver's level of impairment.
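The key measurement is the angle at which the jerking begins. As a rough illustration only, and not INSITE's scoring logic, a mapping from onset angle to a coarse impairment reading might look like the sketch below, which assumes the commonly taught guideline that onset before about 45 degrees is a significant clue:

```python
# Illustrative only: a toy mapping from the angle at which nystagmus begins
# to a coarse impairment reading. The 45-degree guideline is a widely taught
# rule of thumb; the other thresholds are assumptions for this sketch, and
# none of this is INSITE's actual scoring logic.

def classify_onset(onset_angle_deg: float) -> str:
    """Classify impairment from the angle (degrees from center) at which
    the eyes begin to jerk while tracking the officer's finger."""
    if onset_angle_deg >= 45:
        return "no early onset observed (clue not present)"
    if onset_angle_deg >= 40:
        return "borderline early onset; possible impairment"
    return "distinctly early onset; consistent with significant impairment"

if __name__ == "__main__":
    for angle in (50, 43, 35):
        print(f"onset at {angle} degrees -> {classify_onset(angle)}")
```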
“HGN is subtle, and sometimes hard to recognize,” said Dr. Marjorie Zielke PhD’07, director of the Center for Modeling and Simulation/Virtual Humans and Synthetic Societies Lab at UT Dallas. “Simply put: The eyes don’t lie.”
The presence of HGN allows an officer to determine whether a driver is under the influence, said Sgt. Matthew Dusek, a drug recognition expert in the Northeast Police Department near Denton, Texas.
“HGN is an important indicator and one of the best clues that a drunken-driving arrest may be the appropriate action. Using Brian, a virtual trainer, is sure to boost officer confidence,” Dusek said.
Zielke’s team developed Brian in partnership with Sam Houston State University, which received a grant from the Texas Department of Transportation, and EyeT Plus, a group of police training subject-matter experts. INSITE is used throughout Texas as part of the Advanced Roadside Impaired Driving Enforcement (ARIDE) training program directed by Sam Houston State.

Depicted as the head and shoulders of an adult male, Brian combines 3D modeling, programming and animation to create a lifelike appearance. Mathematical algorithms map Brian's physical features and drive his eyes, which follow, and twitch, as an officer moves a finger horizontally in front of the camera-topped screen. Analytical algorithms then give feedback on the officer's performance and technique.
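The article does not detail INSITE's internal algorithms, but the general idea can be sketched: a camera-based tracker reports the finger's horizontal angle each frame, the virtual eyes pursue it, and jerking is layered on once the angle passes an onset threshold tied to the simulated impairment level. The Python sketch below is purely illustrative; the onset formula, jerk frequency and amplitude are assumptions, not the project's actual parameters.

```python
# A minimal, purely illustrative sketch of driving a simulated gaze with
# nystagmus. INSITE's actual algorithms are not described in the article;
# the onset formula, jerk frequency and amplitude below are assumptions.

class SimulatedGaze:
    def __init__(self, bac: float):
        self.bac = bac
        # Assumed relationship: higher simulated BAC -> earlier onset of jerking.
        self.onset_angle = max(20.0, 50.0 - 100.0 * bac)

    def eye_angle(self, finger_angle_deg: float, t: float) -> float:
        """Eye rotation (degrees) given the tracked finger angle at time t (s).

        The eyes smoothly pursue the finger; past the onset angle a
        sawtooth-like jerk is superimposed to mimic nystagmus."""
        gaze = finger_angle_deg  # smooth pursuit of the stimulus
        if abs(finger_angle_deg) >= self.onset_angle:
            jerk_freq = 4.0                      # jerks per second (assumed)
            amplitude = 1.5 + 10.0 * self.bac    # degrees (assumed)
            phase = (t * jerk_freq) % 1.0
            gaze += amplitude * (phase - 0.5)    # slow drift, then fast reset
        return gaze

if __name__ == "__main__":
    brian = SimulatedGaze(bac=0.10)
    for step in range(5):
        t = step * 0.1
        finger = 40.0 + 5.0 * step  # finger sweeping outward from center
        print(f"t={t:.1f}s  finger={finger:.0f} deg  eye={brian.eye_angle(finger, t):.1f} deg")
```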
Brian can be programmed to represent various levels of blood alcohol content (BAC). Instructors can customize the BAC level and create a range of scenarios using different settings for eye redness, wetness, pupil size and pupil dissimilarity.
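As an illustration of what such an instructor-configured scenario might look like, the sketch below uses the parameters named above; the field names, types and value ranges are assumptions rather than INSITE's actual configuration format.

```python
# A sketch of what an instructor-configurable scenario might look like.
# The parameter names mirror the settings mentioned in the article; the
# field types, ranges and defaults are assumptions, not INSITE's format.
from dataclasses import dataclass

@dataclass
class TrainingScenario:
    bac: float                            # simulated blood alcohol content
    eye_redness: float                    # 0.0 (clear) to 1.0 (very red)
    eye_wetness: float                    # 0.0 (dry) to 1.0 (watery)
    pupil_size_mm: float                  # baseline pupil diameter
    pupil_dissimilarity_mm: float = 0.0   # difference between the two pupils

# Example: a moderately impaired subject with red, watery eyes.
scenario = TrainingScenario(
    bac=0.12,
    eye_redness=0.7,
    eye_wetness=0.6,
    pupil_size_mm=4.5,
    pupil_dissimilarity_mm=0.5,
)
print(scenario)
```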
“The research team would like to get Brian into more police training programs,” Zielke said. “But the question is additional funding. At the moment, INSITE is positioned to help educate up to 500 officers in 2019.”
The UT Dallas team is also researching how Brian could be used to identify impairment caused by marijuana and other drugs.