Soft biometrics research may help advertisers, retailers
By Bridget Maiellaro, ECE Illinois
July 16, 2008
- Prof. Thomas Huang and his research team are working on "Electronic Consumer Relations Management," which aims to improve future technologies.
- Huang is hoping to create computers that can recognize aspects of a person like gender, age, and ethnicity to study people's buying habits or reactions to marketing.
ECE Professor Thomas S. Huang and his graduate students are currently developing and improving ways for society to communicate on a more personal level. Through a variety of algorithms and applications, the researchers are experimenting to make technological advances even more appealing.
Huang, along with graduate students Xun (Jason) Xu, Zhen Li, Xi Zhou, Hao Tang, Yuxiao Hu and Liangliang Cao, has been studying "Electronic Consumer Relations Management," or ECRM, to learn more about human behavior. One part of their research involves collecting demographic data by tracking what percentages of people buy certain items.
"For instance, fast food companies may want to know what percentage of males buy a hamburger or what percentage of teenagers buy chicken," Huang said. "If you have the camera, the microphone at the cashier looking at the customer, the computer can recognize gender, age, and possibly ethnicity. Then you can collect computer statistics."
Huang said that the research involves soft biometrics: the goal is to recognize certain aspects of each person, such as gender, age, and ethnicity, rather than to determine anyone's full identity. The researchers have therefore created a variety of algorithms to estimate those aspects of an individual from facial characteristics and tone of voice.
- The gender-from-face system, created by Xun Xu, detects the face of the person in front of the camera, frames it with a green box on the computer screen, and determines whether the person is male or female based on facial features. When the program decides the subject is female, the box turns red; for subjects deemed male, it turns blue. If the algorithm is unable to determine the person's gender, the box remains green.
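The color-coded box described above amounts to a simple decision rule on top of a classifier's output. A minimal sketch, assuming a hypothetical upstream classifier that reports the probability the detected face is female (the article does not describe the system's actual interface or thresholds):

```python
def box_color(female_prob, threshold=0.8):
    """Map a gender classifier's output to a frame color.

    female_prob: probability that the detected face is female,
    produced by some upstream classifier (hypothetical here).
    Returns 'red' (female), 'blue' (male), or 'green' (undecided).
    """
    if female_prob >= threshold:
        return "red"              # confident the subject is female
    if female_prob <= 1.0 - threshold:
        return "blue"             # confident the subject is male
    return "green"                # not confident enough to decide
```

Requiring a confidence margin, rather than a bare 50/50 split, is what would keep the box green when the classifier is unsure.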
- Created by Zhen Li, the gender-from-speech program assesses whether a person is male or female by recording a short segment of the person's speech. Once the recording is processed, the algorithm decides on male or female based on the person's pitch and other vocal characteristics. While the algorithms are currently separate, Huang said the idea is to combine the voice and face applications to determine a person's gender.
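The article says the speech program relies on pitch among other vocal characteristics. A common baseline for pitch estimation is autocorrelation; the sketch below uses it, with an illustrative 165 Hz male/female boundary. Both the algorithm and the cutoff are assumptions for illustration, not details of Li's program:

```python
import math

def estimate_pitch(samples, sample_rate, fmin=60.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) by picking the lag with the
    highest autocorrelation within the plausible speech-pitch range."""
    lo = int(sample_rate / fmax)          # smallest lag to search
    hi = int(sample_rate / fmin)          # largest lag to search
    best_lag, best_corr = lo, float("-inf")
    for lag in range(lo, min(hi, len(samples) - 1)):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

def gender_from_pitch(f0, boundary_hz=165.0):
    """Crude threshold: adult male speech tends to sit below roughly
    165 Hz, adult female speech above it (illustrative cutoff only)."""
    return "female" if f0 > boundary_hz else "male"

# Demo on a synthetic stand-in for speech: 0.2 s of a 120 Hz tone at 8 kHz.
rate = 8000
samples = [math.sin(2 * math.pi * 120 * n / rate) for n in range(1600)]
f0 = estimate_pitch(samples, rate)
label = gender_from_pitch(f0)
```

A real system would also use spectral features rather than pitch alone, which is presumably what "other vocal characteristics" covers.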
- Xi Zhou is currently constructing an age estimation program. While a demonstration is still in the works, Huang said that the algorithm, developed and tested on several databases, estimates age to within plus or minus six years. The group hopes to have the program ready for testing within a few weeks.
While these programs show potential, Huang said a major limitation is that the video camera may not capture a frontal view of a person's face.
"So far, we have worked mainly with faces," Huang said. "And so far, most of the work, done not only by us but by other groups, is mainly based on frontal view. In many situations, however, you may not have the frontal view; you may only have a side view."
Thus, the group is working to improve its current algorithms in order to recognize a person’s gender, age, and ethnicity from non-frontal views.
The researchers are creating algorithms to track body movements and obtain cues that may help determine a person’s gender. From these supplementary features, Huang believes they also will be able to read emotions better.
"Our algorithms of recognizing emotion and gender and so forth are evolving," Huang said. "So hopefully, they are getting better and better. For each of the algorithms, we want to make them faster and more robust."
Editor's note: media inquiries should be directed to Brad Petersen, Director of Communications, at firstname.lastname@example.org or (217) 244-6376.