Emotion recognition and estimation from tracked lip features
2004 (English) Report (Other academic)
Humans transmit and display information through the visual image of their lips: information about speech, emotions, and expressions. By tracking the movement of the lips, computers can make use of this visual information for multiple purposes. Previous research has mainly focused on using lip tracking for speechreading; here we focus on how tracked lip features can be used for emotion recognition. We have found that people are better at interpreting basic emotions displayed through an animation of lips than at interpreting the same emotions displayed through a real video sequence showing the lower part of the face. We have successfully transferred three basic emotions from visual information into another modality: touch.
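The report does not specify its feature set or classification method in this abstract, so the following is only a minimal illustrative sketch of how tracked lip geometry could be mapped onto basic emotion labels. The feature names (`mouth_width`, `openness`, `corner_lift`) and all thresholds are hypothetical assumptions, not the authors' method.

```python
def classify_emotion(mouth_width, openness, corner_lift):
    """Toy rule-based mapping from normalized lip features to an emotion label.

    Hypothetical features (not from the report):
      mouth_width: horizontal lip extent relative to a neutral face (1.0 = neutral)
      openness:    vertical lip separation relative to neutral (1.0 = neutral)
      corner_lift: signed displacement of the lip corners (up > 0, down < 0)
    """
    if openness > 1.5:
        return "surprise"      # widely parted lips
    if corner_lift > 0.2 and mouth_width > 1.1:
        return "happiness"     # raised corners, stretched mouth
    if corner_lift < -0.2:
        return "sadness"       # drooping corners
    return "neutral"

print(classify_emotion(1.2, 1.0, 0.3))    # happiness
print(classify_emotion(1.0, 1.8, 0.0))    # surprise
print(classify_emotion(0.95, 0.9, -0.3))  # sadness
```

A real system would replace these hand-picked rules with a classifier trained on tracked lip contours, but the sketch shows the basic idea of turning a small set of geometric lip features into one of a few basic emotion categories.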
Place, publisher, year, edition, pages
Umeå: Tillämpad fysik och elektronik, 2004. 6 p.
DML Technical Report, ISSN 1652-8441 ; 5
Keywords
Signal processing, lip animation, emotion recognition, lip tracking
Identifiers
URN: urn:nbn:se:umu:diva-412; OAI: oai:DiVA.org:umu-412; DiVA: diva2:143396