iFeeling: Vibrotactile rendering of human emotions on mobile phones
2010 (English). In: Mobile multimedia processing: fundamentals, methods, and applications / [ed] Xiaoyi Jiang, Matthew Y. Ma, Chang Wen Chen. Heidelberg, Germany: Springer Berlin, 2010, 1st edition, pp. 1-20. Chapter in book (Other academic)
Today, mobile phone technology is mature enough to let us interact with mobile phones effectively through three major senses: vision, hearing, and touch. Just as the camera adds interest and utility to the mobile experience, the vibration motor in a mobile phone offers a new way to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than one bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind so that they can sense the emotional state of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users can indeed receive and interpret more than one bit of emotional information. This demonstrates the potential of the touch channel to enrich communication among mobile phone users.
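The claim that a single vibration motor can carry more than one bit can be sketched as follows. This is a hypothetical coding scheme for illustration only; the emotion categories and pulse patterns are assumptions, not the chapter's actual design:

```python
# Hypothetical vibrotactile coding scheme (illustrative, not the authors'
# actual design): each emotion category maps to a distinct on/off pulse
# rhythm, so one pattern conveys log2(N) bits for N equally likely categories.
import math

# Assumed emotion categories and pulse schedules as (duration_ms, motor_on).
PATTERNS = {
    "neutral": [(200, True)],                             # one short pulse
    "happy":   [(100, True), (100, False), (100, True)],  # two quick pulses
    "sad":     [(600, True)],                             # one long pulse
    "angry":   [(80, True), (80, False)] * 3,             # rapid triple burst
}

def bits_per_pattern(patterns):
    """Information carried by one pattern if all categories are equally likely."""
    return math.log2(len(patterns))

def render(emotion):
    """Return the motor on/off schedule that a phone's vibrator API would play."""
    return PATTERNS[emotion]

print(bits_per_pattern(PATTERNS))  # 2.0 -> more than one bit per pattern
```

With four distinguishable patterns, each rendered pattern carries two bits, which is the sense in which carefully designed vibration patterns exceed the simple one-bit on/off buzz.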
Place, publisher, year, edition, pages
Heidelberg, Germany: Springer Berlin, 2010, 1st edition, pp. 1-20
Series: Lecture Notes in Computer Science, ISSN 0302-9743 (print), 1611-3349 (online); 5960
Keywords: emotion estimation, vibrotactile rendering, lip tracking, mobile communication, tactile coding, mobile phone
Research subject: Computerized Image Analysis
Identifiers
URN: urn:nbn:se:umu:diva-32998
DOI: 10.1007/978-3-642-12349-8_1
ISBN: 978-3-642-12348-1
OAI: oai:DiVA.org:umu-32998
DiVA: diva2:308441