U of Pittsburgh Tests Face Emotion Icons

Read about facial expression synthesis developed by the University of Pittsburgh for using emotion icons in the virtual classroom.


April 2, 2007 – Most online chatters are familiar with Instant Messaging icons such as ":-)" or ":-(", used to indicate their emotional state over the Web. What if you could send IMs with a picture of yourself, changing your facial expression just as easily? Researchers at the University of Pittsburgh have made it possible with Face Alive Icons, emotion icons created from a single photo of a user.


ASCII characters for IMs have long been popular among online chatters but lack real-life emotion, according to Face Alive researcher and current Google software engineer Xin Li. Face Alive Icons take expressing emotions over the Web to the next level by using a real portrait photo.

The University of Pittsburgh researchers originally developed their own version of emotion icons, also known as "emoticons," for distance learning over the Web. In an online classroom setting, students can change the expression of their Face Alive Icons to signal how well they understand the course material. For example, a student unsatisfied with a virtual lesson could select a menu option that changes his icon's expression to sad.

The new Face Alive Icons can establish a face-to-face communication between the teacher and student, according to Li. "It has a lot of potential in the virtual classroom…We can express real feelings with real images," he said.

When asked why not simply use a webcam attached to a computer for live classroom interaction, University of Pittsburgh computer science professor and co-author Dr. Shi-Kuo Chang said that the Face Alive technology has greater flexibility in low-bandwidth settings. If a student is using dial-up Internet, a Face Alive Icon allows quicker interaction.

Face Alive Icons, developed by Xin Li, Shi-Kuo Chang, and Chieh-Chih Chang of the Industrial Technology Research Institute in Taiwan, are a hybrid of existing warping and morphing software. Their approach is made up of two processes. A front-facing portrait of the user is first decomposed into static facial features such as the nose, ears, and hair, objects that remain the same regardless of emotion. The icon is then synthesized with combinations of certain expressional features, changes in the eyes and mouth that typically convey feeling, to create a Facial Icon Profile (FIP). The software modifies key points of the expressional features to denote different emotions, such as happiness, sadness, surprise, anger, disgust, and fear.
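The two-step idea can be sketched in a few lines of code: keep the static portrait untouched, and shift only the key points of the expressional features per emotion. This is a minimal illustrative sketch, not the researchers' actual implementation; all key-point names, coordinates, and offsets below are hypothetical.

```python
# Hypothetical sketch of the Face Alive two-step approach: a neutral set of
# expressional key points (the static features stay fixed) plus per-emotion
# offsets that warp those points. Names, coordinates, and offsets are
# illustrative only, chosen for a 64 x 64 pixel icon.

# Neutral key points for the expressional features, as (x, y) pixels.
NEUTRAL_KEY_POINTS = {
    "mouth_left":  (24, 44),
    "mouth_right": (40, 44),
    "brow_left":   (24, 20),
    "brow_right":  (40, 20),
}

# Per-emotion shifts applied to the neutral key points.
EMOTION_OFFSETS = {
    "happiness": {"mouth_left": (0, -3), "mouth_right": (0, -3)},
    "sadness":   {"mouth_left": (0, 3),  "mouth_right": (0, 3)},
    "surprise":  {"brow_left": (0, -4),  "brow_right": (0, -4)},
}

def synthesize(profile, emotion):
    """Return the key points of `profile` shifted toward `emotion`."""
    offsets = EMOTION_OFFSETS[emotion]
    result = {}
    for name, (x, y) in profile.items():
        dx, dy = offsets.get(name, (0, 0))  # untouched points keep (0, 0)
        result[name] = (x + dx, y + dy)
    return result

sad = synthesize(NEUTRAL_KEY_POINTS, "sadness")
print(sad["mouth_left"])  # mouth corner moves down: (24, 47)
```

In the real system a warping algorithm would then deform the portrait image so that the expressional features follow these moved key points, while the decomposed static features are composited back unchanged.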

When tested for accuracy, 93 percent of viewers recognized the Face Alive Icon for "Surprise," while only 69 percent recognized the icon for "Disgust."


Still used on an experimental basis at the University of Pittsburgh, Face Alive Icons could also have broader applications in mobile communication. Because the icons are small (64 x 64 pixels), they are well suited to portable devices, according to the researchers.

"Young people use those stylized visual messages all the time. We can combine Face Alive with more personalization to enrich the communication process," said Chang.

*Other contributors to the Face Alive research include Dr. M.J. Lyons, who provided JAFFE data, and Jui-Hsin Huang, who provided images.*
