
Dr. Carlos Busso hopes computers will one day sense how you’re feeling. The associate professor of electrical engineering is designing speech recognition tools that understand human emotion.  

To further his research, Busso has received a National Science Foundation Faculty Early Career Development (CAREER) Award, which provides nearly $500,000 in funding over the next five years.

“Identifying and characterizing emotional behavior is challenging, but it’s an important research topic for human-computer interaction,” said Busso, whose Multimodal Signal Processing Lab is housed in the Erik Jonsson School of Engineering and Computer Science. “The shortcomings of current algorithms in recognizing expressive behaviors during natural human interaction are the key barrier to using emotion-aware technology in real-life applications.”

Busso plans to create new algorithms that recognize spontaneous behaviors from speech and that capture the underlying externalization process of emotions in real-world conditions.

“My proposed models and algorithms promise insights to explore and extend theories in linguistic and paralinguistic human behaviors,” he said. “Several new scientific avenues can emerge that serve as truly innovative advancements that will impact applications in security and defense, the next generation of advanced user interfaces, health behavior informatics and education.”

Busso is in the early stages of his research. His first goal is to identify speech recordings from open-access sources and social media to create a database of emotionally charged sentences to analyze. Once he has obtained the recordings, he will use crowdsourcing techniques to allow trained ears to label the speech segments according to basic classifications of emotion — happy, sad or angry, for example.
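Crowdsourced annotation of this kind typically reduces several listeners’ judgments of the same segment to a single consensus label. The sketch below shows one common way to do that, a simple majority vote with an agreement score; the function name, segment labels and tie-breaking behavior are illustrative assumptions, not details of Busso’s actual pipeline.

```python
from collections import Counter

def aggregate_labels(annotations):
    """Return the majority emotion label for one segment and the fraction
    of annotators who chose it."""
    counts = Counter(annotations)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(annotations)

# Hypothetical labels from five annotators listening to the same segment.
segment_labels = ["happy", "happy", "neutral", "happy", "sad"]
consensus, agreement = aggregate_labels(segment_labels)
print(f"consensus={consensus}, agreement={agreement:.2f}")  # consensus=happy, agreement=0.60
```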

“With all this data, we can evaluate models. It gives us an opportunity to see how well our models will work in real-life settings,” Busso said. “Our goal is that the dataset reaches 100 hours of annotated emotional data.”
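To make that kind of evaluation concrete, the snippet below compares a model’s predicted emotion labels against crowdsourced consensus labels and reports plain accuracy; the label lists and the choice of accuracy as the metric are assumptions for illustration only.

```python
# Hypothetical comparison of model predictions against consensus annotations.
consensus_labels = ["happy", "sad", "angry", "happy", "neutral"]
predicted_labels = ["happy", "sad", "happy", "happy", "neutral"]

correct = sum(p == c for p, c in zip(predicted_labels, consensus_labels))
accuracy = correct / len(consensus_labels)
print(f"accuracy = {accuracy:.2f}")  # accuracy = 0.80
```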


Busso is already relying on his students to help gather the data and hopes his work will encourage more young researchers to become involved.

“The role of human-centered technologies can inspire young scholars into computing and engineering. The field can attract and nurture everyone from high school students to doctoral trainees,” he said. “This research provides a unique opportunity to increase engagement among K-12, undergraduate and graduate students with human-centered algorithms and sensing technologies contextualized in real-life applications.”

Busso, who is from Chile, also serves as a mentor and role model for high school, undergraduate and graduate students. Through lab open houses, demonstrations, and active online and social media presence, he hopes to bring his research to nontraditional students, as well as the broader, nontechnical audience interested in human behavior science.

Busso is one of 14 Jonsson School faculty members supported by CAREER awards. Other 2015 recipients are Dr. Zhiqiang Lin, assistant professor of computer science; Dr. Bilal Akin, associate professor of electrical engineering; and Dr. Fatemeh Hassanipour, assistant professor of mechanical engineering.

Busso was also the inaugural recipient of the 10-Year Technical Impact Award given by the Association for Computing Machinery International Conference on Multimodal Interaction for one of the first studies of audiovisual emotion recognition. The work analyzed the limitations of detecting emotions from speech or facial expressions alone and discussed the benefits of using both modalities at the same time.
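The benefit of combining modalities can be illustrated with a simple late-fusion sketch: separate classifiers score the speech and the facial expressions, and their class probabilities are averaged before picking the final emotion. The probability values and the equal weighting below are assumptions for illustration, not the method of the award-winning study.

```python
# Hypothetical late fusion of a speech-based and a face-based emotion classifier:
# average the per-class probabilities and pick the highest-scoring emotion.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

speech_probs = {"happy": 0.40, "sad": 0.15, "angry": 0.35, "neutral": 0.10}
face_probs = {"happy": 0.55, "sad": 0.10, "angry": 0.20, "neutral": 0.15}

fused = {e: 0.5 * speech_probs[e] + 0.5 * face_probs[e] for e in EMOTIONS}
prediction = max(fused, key=fused.get)
print(prediction, round(fused[prediction], 3))  # happy 0.475
```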