iFeeling: vibrotactile rendering of human emotions on mobile phones
2010 (English). In: Mobile multimedia processing: fundamentals, methods, and applications, Springer, 2010, pp. 1-20. Conference paper (Refereed).
Today, mobile phone technology is mature enough to let us interact with mobile phones effectively through our three major senses, namely vision, hearing, and touch. Just as the camera adds interest and utility to the mobile experience, the vibration motor in a mobile phone offers a new way to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than one bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind so that they can sense the emotional information of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users do receive and interpret more than one bit of emotional information. This shows the potential to enrich mobile phone communication among users through the touch channel.
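The abstract's central claim, that controlled vibration patterns can carry more than one bit per message, can be illustrated with a minimal sketch. The emotion labels, pulse timings, and function names below are hypothetical illustrations, not the chapter's actual coding scheme:

```python
import math

# Hypothetical vibrotactile coding scheme: each emotion maps to a distinct
# sequence of (on_ms, off_ms) vibration pulses for a single motor.
EMOTION_PATTERNS = {
    "neutral": [(100, 0)],                        # one short pulse
    "happy":   [(100, 100), (100, 0)],            # two short pulses
    "sad":     [(600, 0)],                        # one long pulse
    "angry":   [(100, 50), (100, 50), (100, 0)],  # three rapid pulses
}

def render(emotion):
    """Flatten an emotion's pattern into the on/off timing list a motor would play."""
    timings = []
    for on_ms, off_ms in EMOTION_PATTERNS[emotion]:
        timings.append(on_ms)
        if off_ms:
            timings.append(off_ms)
    return timings

# Four distinguishable patterns carry log2(4) = 2 bits per message,
# i.e. more than the 1 bit of a plain on/off buzz.
bits_per_message = math.log2(len(EMOTION_PATTERNS))
```

On a real handset such a timing list would be handed to the platform's vibration API; the chapter's usability question is then whether users can reliably tell the patterns apart, which its user tests address.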
Place, publisher, year, edition, pages
Springer, 2010. pp. 1-20.
Series: Lecture Notes in Computer Science, ISSN 0302-9743; 5960
Keywords
emotion estimation, vibrotactile rendering, lip tracking, mobile communication, tactile coding, mobile phone
Computer Science; Telecommunications
Identifiers
URN: urn:nbn:se:umu:diva-109045
ISI: 000278828400001
ISBN: 978-3-642-12348-1
OAI: oai:DiVA.org:umu-109045
DiVA: diva2:856633
Conference
1st International Workshop on Mobile Multimedia Processing, Dec 7, 2008, Tampa, FL