Tongue operated electric wheelchair
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics (Digital Media Lab).
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics.
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics.
2010 (English) Conference paper, Published paper (Other academic)
Abstract [en]

In this paper we propose a tongue-operated electric wheelchair system to aid persons with severe disabilities. Real-time tongue gestures are detected and estimated from a video camera pointed at the user's face, and these gestures are used to control the wheelchair. Because the system is a non-contact vision system, it is much more convenient to use: the user only needs to move his/her tongue to command the wheelchair. To make the system easier to drive, it is also equipped with a laser scanner for obstacle avoidance. A 2D array of vibrators mounted on the chair gives the user feedback on the response to his/her tongue movements and on the status of the system. We are carrying out user tests to measure the usability of the system.
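The control flow described in the abstract can be sketched as a minimal gesture-to-command loop. The gesture classes, command names, and the laser-scanner override are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch of the control loop: a tongue gesture estimated
# from the camera is mapped to a drive command, and the laser scanner's
# obstacle flag overrides forward motion. All names are assumptions.
GESTURE_TO_COMMAND = {
    "up": "forward",
    "down": "reverse",
    "left": "turn_left",
    "right": "turn_right",
    "neutral": "stop",
}

def command_for(gesture: str, obstacle_ahead: bool) -> str:
    """Resolve a detected tongue gesture into a wheelchair command,
    letting obstacle avoidance take priority over forward motion."""
    cmd = GESTURE_TO_COMMAND.get(gesture, "stop")  # unknown gesture -> stop
    if cmd == "forward" and obstacle_ahead:
        return "stop"  # laser scanner reports an obstacle ahead
    return cmd
```

In a real system this mapping would sit inside a per-frame loop fed by the vision module, with the result forwarded to the motor controller and echoed on the vibrator array.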

Place, publisher, year, edition, pages
2010. pp. 133-136
National subject category
Signal Processing
Research subject
computerized image analysis
Identifiers
URN: urn:nbn:se:umu:diva-32999  OAI: oai:DiVA.org:umu-32999  DiVA id: diva2:308445
Conference
Swedish Symposium on Image Analysis
Available from: 2012-03-29  Created: 2010-04-06  Last updated: 2018-06-08  Bibliographically reviewed
Part of thesis
1. Expressing emotions through vibration for perception and control
2010 (English) Doctoral thesis, comprehensive summary (Other academic)
Alternative title [en]
Expressing emotions through vibration
Abstract [en]

This thesis addresses a challenging problem: "how to let the visually impaired 'see' others' emotions". We, human beings, depend heavily on facial expressions to express ourselves. A smile shows that the person you are talking to is pleased, amused, relieved, etc. People use emotional information from facial expressions to switch between conversation topics and to determine the attitudes of individuals. Missing the emotional information carried by facial expressions and head gestures makes it extremely difficult for the visually impaired to interact with others in social settings. To enhance the social interaction abilities of the visually impaired, in this thesis we have been working on the scientific topic of 'expressing human emotions through vibrotactile patterns'.

It is quite challenging to deliver human emotions through touch, since our touch channel is very limited. We first investigated how to render emotions through a single vibrator. We developed a real-time "lipless" tracking system to extract dynamic emotions from the mouth and employed mobile phones as a platform for the visually impaired to perceive primary emotion types. Later on, we extended the system to render more general dynamic media signals, for example rendering live football games through vibration on the mobile phone to improve mobile users' communication and entertainment experience. To display more natural emotions (i.e. emotion type plus emotion intensity), we developed technology that enables the visually impaired to directly interpret human emotions. This was achieved through machine vision techniques and a vibrotactile display. The display comprises a 'vibration actuator matrix' mounted on the back of a chair, whose actuators are sequentially activated to convey dynamic emotional information. The research focus has been on finding a global, analytical, and semantic representation for facial expressions to replace the state-of-the-art Facial Action Coding System (FACS) approach. We proposed to use the manifold of facial expressions to characterize dynamic emotions. The basic emotional expressions with increasing intensity become curves on the manifold extending from the center, and blends of emotions lie between those curves, so they can be defined analytically by the positions of the main curves. The manifold is the "Braille code" of emotions.
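The manifold geometry described above (basic emotions as curves radiating from a neutral center, blends lying between them) can be illustrated with a toy 2D model. The specific angles and the linear blend are assumptions for illustration, not values or formulas from the thesis:

```python
import math

# Toy model of the emotion manifold: each basic emotion is a direction
# from the neutral center, intensity is the distance travelled along it,
# and blended emotions fall between the main curves. The angle
# assignments below are illustrative assumptions.
EMOTION_ANGLE = {"happy": 0.0, "surprised": 90.0, "sad": 180.0, "angry": 270.0}

def manifold_point(emotion: str, intensity: float) -> tuple:
    """Place an emotion of given intensity (0..1) on the 2D manifold;
    intensity 0 is the neutral center for every emotion."""
    theta = math.radians(EMOTION_ANGLE[emotion])
    return (intensity * math.cos(theta), intensity * math.sin(theta))

def blend(e1: str, e2: str, w: float, intensity: float) -> tuple:
    """Linearly blend two basic-emotion curves (w in 0..1): the result
    lies between the curves, as the analytical definition suggests."""
    x1, y1 = manifold_point(e1, intensity)
    x2, y2 = manifold_point(e2, intensity)
    return ((1 - w) * x1 + w * x2, (1 - w) * y1 + w * y2)
```

The thesis keywords point to Locally Linear Embedding for learning such a manifold from facial-expression data; this sketch only shows how positions on an already-learned 2D manifold could encode type and intensity.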

The developed methodology and technology have been extended to build assistive wheelchair systems for a specific group of disabled people, cerebral palsy or stroke patients (i.e. those lacking fine motor control skills), who cannot access and control a wheelchair by conventional means such as a joystick or chin stick. The solution is to extract the manifold of head or tongue gestures for controlling the wheelchair. The manifold is rendered by a 2D vibration array to provide the wheelchair user with action information from his/her gestures and with system status information, which is very important for enhancing the usability of such an assistive system. The current research not only provides a foundation stone for vibrotactile rendering systems based on object localization but is also a concrete step toward a new dimension of human-machine interaction.
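Rendering a manifold position on the 2D vibration array amounts to quantizing a continuous coordinate to one actuator in a grid. A minimal sketch, assuming a square grid and unit-square coordinates (the actual array size and coordinate convention are not given in the record):

```python
# Hedged sketch: map a point on the gesture manifold, normalized to
# [0,1] x [0,1], to one actuator cell in a rows x cols vibration array
# mounted on the chair. The 4x4 grid size is an assumption.
def actuator_index(x: float, y: float, rows: int = 4, cols: int = 4):
    """Quantize a normalized manifold coordinate to a grid cell,
    clamping out-of-range inputs to the nearest edge actuator."""
    r = min(rows - 1, max(0, int(y * rows)))
    c = min(cols - 1, max(0, int(x * cols)))
    return r, c
```

Activating the cells along the user's gesture trajectory in sequence would then give the dynamic feedback the abstract describes.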

Place, publisher, year, edition, pages
Umeå: Umeå University, Department of Applied Physics and Electronics, 2010. p. 159
Series
Digital Media Lab, ISSN 1652-6295 ; 12
Keywords
Multimodal Signal Processing, Mobile Communication, Vibrotactile Rendering, Locally Linear Embedding, Object Detection, Human Facial Expression Analysis, Lip Tracking, Object Tracking, HCI, Expectation-Maximization Algorithm, Lipless Tracking, Image Analysis, Visually Impaired.
National subject category
Signal Processing; Computer Graphics and Computer Vision; Computer Sciences; Telecommunications; Information Systems
Research subject
computerized image analysis; administrative data processing; electronics; systems analysis
Identifiers
URN: urn:nbn:se:umu:diva-32990  ISBN: 978-91-7264-978-1
Public defence
2010-04-28, Naturvetarhuset, N300, Umeå universitet, Umeå, Sweden, 09:00 (English)
Project
Taktil Video
Available from: 2010-04-07  Created: 2010-04-06  Last updated: 2025-02-01  Bibliographically reviewed

Open Access in DiVA

Full text not available in DiVA

Authors

ur Réhman, Shafiq; Rönnbäck, Sven; Liu, Li
