Vibrotactile rendering of head gestures for controlling electric wheelchair
2009 (English) In: Proceedings of IEEE International Conference on Systems, Man and Cybernetics, San Antonio, Texas, USA: IEEE, 2009, pp. 413-417. Conference paper (Refereed)
We have developed a head-gesture-controlled electric wheelchair system to aid persons with severe disabilities. Real-time range information obtained from a stereo camera is used to locate and segment the user's face in the sensed video. For head pose estimation, we use an Isomap-based nonlinear manifold learning map of facial textures. Because our system is a non-contact vision system, it is much more convenient to use: the user only needs to gesture with his or her head to command the wheelchair. To overcome problems with a non-responding system, the user must be notified of the exact system state while the system is in use. In this paper, we explore the use of vibrotactile rendering of head gestures as feedback. Three different feedback systems are developed and tested: audio stimuli, vibrotactile stimuli, and combined audio plus vibrotactile stimuli. We have performed user tests to study the usability of these three display methods. The usability studies show that the combined audio plus vibrotactile feedback outperforms the other two methods (audio-only and vibrotactile-only stimuli).
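The abstract describes estimating head pose from an Isomap-based nonlinear manifold of facial textures. The paper's own implementation is not given here; the following is a minimal hypothetical sketch of that general idea using scikit-learn's `Isomap`, with synthetic stand-in data in place of real segmented face images (the array shapes, yaw range, and neighbor counts are illustrative assumptions, not values from the paper).

```python
# Hypothetical sketch, not the authors' code: embed face textures with
# Isomap, then map manifold coordinates to a head pose angle.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented face textures: 100 flattened 16x16
# patches whose appearance varies smoothly with a yaw angle in [-45, 45].
yaw = rng.uniform(-45, 45, size=100)
faces = np.outer(yaw, np.ones(256)) + rng.normal(0.0, 1.0, size=(100, 256))

# Learn a low-dimensional nonlinear manifold of the texture space.
embedding = Isomap(n_neighbors=8, n_components=2).fit_transform(faces)

# Regress pose from manifold coordinates with a simple k-NN lookup.
pose_model = KNeighborsRegressor(n_neighbors=3).fit(embedding, yaw)
mean_err = np.abs(pose_model.predict(embedding) - yaw).mean()
print(f"mean absolute yaw error on training data: {mean_err:.2f} deg")
```

In a real system, each command gesture (e.g. nod left/right) would be recognized from the trajectory of pose estimates over time, and the recognized state would then drive the audio or vibrotactile feedback studied in the paper.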
Place, publisher, year, edition, pages
San Antonio, Texas, USA: IEEE, 2009. pp. 413-417.
Keywords: extended Isomap, Multidimensional Scaling (MDS), head gesture recognition, vibrotactile rendering, wheelchair system, usability
Research subject: Computerized Image Analysis
Identifiers
URN: urn:nbn:se:umu:diva-32993
DOI: 10.1109/ICSMC.2009.5346213
ISBN: 978-1-4244-2793-2
OAI: oai:DiVA.org:umu-32993
DiVA: diva2:308424
Conference: IEEE International Conference on Systems, Man and Cybernetics, San Antonio, Texas, USA, 2009