Bring Your Body into Action: Body Gesture Detection, Tracking, and Analysis for Natural Interaction
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics. (Digital Media Lab)
2014 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Due to the large influx of computers into our daily lives, human-computer interaction has become crucially important. For a long time, focusing on what users need has been critical for designing interaction methods. However, a new perspective extends this attitude to encompass how human desires, interests, and ambitions can be met and supported. This implies that the way we interact with computers should be revisited. Placing human values, rather than user needs, at the centre is of the utmost importance for providing new interaction techniques. These values drive our decisions and actions, and are essential to what makes us human. This motivated us to introduce new interaction methods that will support human values, particularly human well-being.

The aim of this thesis is to design new interaction methods that will empower humans to have a healthy, intuitive, and pleasurable interaction with tomorrow's digital world. To achieve this aim, this research is concerned with developing theories and techniques for exploring interaction methods beyond the keyboard and mouse, utilizing the human body. This thesis therefore addresses a fundamental problem: human motion analysis.

The technical contributions of this thesis introduce computer vision-based, marker-less systems to estimate and analyze body motion. The main focus of this research work is on head and hand motion analysis, since these are the most frequently used body parts for interacting with computers. This thesis gives an insight into the technical challenges and provides new perspectives and robust techniques for solving the problem.

Place, publisher, year, edition, pages
Umeå: Umeå universitet, 2014. p. 70
Series
Digital Media Lab, ISSN 1652-6295 ; 19
Keywords [en]
Human Well-Being, Bodily Interaction, Natural Interaction, Human Motion Analysis, Active Motion Estimation, Direct Motion Estimation, Head Pose Estimation, Hand Pose Estimation.
National subject category
Signal Processing
Research subject
signal processing; computerized image analysis
Identifiers
URN: urn:nbn:se:umu:diva-88508
ISBN: 978-91-7601-067-9 (print)
OAI: oai:DiVA.org:umu-88508
DiVA, id: diva2:716122
Public defence
2014-06-04, Naturvetarhuset, N420, Umeå universitet, Umeå, 13:00 (English)
Opponent
Supervisors
Available from: 2014-05-14 Created: 2014-05-08 Last updated: 2018-06-07 Bibliographically approved
List of papers
1. Camera-based gesture tracking for 3D interaction behind mobile devices
2012 (English) In: International Journal of Pattern Recognition and Artificial Intelligence, ISSN 0218-0014, Vol. 26, no. 8, p. 1260008. Article in journal (Refereed) Published
Abstract [en]

The number of mobile devices such as smartphones or tablet PCs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays that make the interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate in the 3D space behind the device, in the camera's field of view. In this paper, our gestural interaction relies on particular patterns of local image orientation called rotational symmetries. The approach is based on finding the most suitable pattern from a large set of rotational symmetries of different orders, which ensures a reliable detector for fingertips and the user's gesture. Consequently, gesture detection and tracking can be used as an efficient tool for 3D manipulation in various virtual/augmented reality applications.
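As a rough, hedged illustration of the rotational-symmetry idea in this abstract (not the authors' implementation; the double-angle orientation representation, Gaussian window, filter size and the symmetry order used for fingertip-like patterns are assumptions here), a local-orientation image can be correlated with a symmetry basis filter and the response magnitude thresholded to find candidate fingertip patterns:

```python
import numpy as np
from scipy.ndimage import sobel
from scipy.signal import fftconvolve

def rotational_symmetry_response(gray, order=1, sigma=5.0):
    """Response of an order-`order` rotational-symmetry detector (sketch only)."""
    gray = gray.astype(float)
    # Double-angle local orientation image: z = (Ix + i*Iy)^2
    Ix = sobel(gray, axis=1)
    Iy = sobel(gray, axis=0)
    z = (Ix + 1j * Iy) ** 2
    # Gaussian-windowed symmetry basis filter b(x, y) = w(x, y) * exp(i*order*phi)
    r = int(3 * sigma)
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    phi = np.arctan2(y, x)
    w = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    basis = w * np.exp(1j * order * phi)
    # |correlation| peaks where the local orientation field matches the chosen
    # symmetry order (e.g. roughly parabolic patterns at a fingertip/line ending)
    return np.abs(fftconvolve(z, np.conj(basis), mode="same"))
```

Thresholding the returned magnitude image (and suppressing non-maxima) would give candidate fingertip locations to track across frames.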

Keywords
3D mobile interaction, rotational symmetries, gesture detection, tracking, mobile applications
National subject category
Human-Computer Interaction (Interaction Design)
Identifiers
urn:nbn:se:umu:diva-67979 (URN)
10.1142/S0218001412600087 (DOI)
000315523100006
Available from: 2013-04-09 Created: 2013-04-09 Last updated: 2018-06-08 Bibliographically approved
2. Real 3D Interaction Behind Mobile Phones for Augmented Environments
2011 (English) In: 2011 IEEE International Conference on Multimedia and Expo (ICME), IEEE conference proceedings, 2011, p. 1-6. Conference paper, Published paper (Refereed)
Abstract [en]

The number of mobile devices such as mobile phones or PDAs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays which make the interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate in the 3D space behind the device, in the camera's field of view. This paper suggests the use of particular patterns of local image orientation, called rotational symmetries, to detect and localize human gestures. The relative rotation and translation of the gesture between consecutive frames are estimated by extracting stable features. Consequently, this information can be used to facilitate the 3D manipulation of virtual objects in various mobile applications.
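Since the abstract recovers relative rotation and translation of the gesture between consecutive frames from stable features (SIFT is listed among the keywords), a hedged sketch of that step with off-the-shelf tools is a robust 2D similarity fit over SIFT matches; the paper's actual feature selection, gesture masking and motion model are not reproduced here:

```python
import cv2
import numpy as np

def relative_rotation_translation(prev_gray, curr_gray):
    """In-plane rotation (degrees) and translation (pixels) between two frames (sketch)."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(prev_gray, None)
    k2, d2 = sift.detectAndCompute(curr_gray, None)
    # Lowe's ratio test keeps only distinctive, stable correspondences
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(d1, d2, k=2)
            if m.distance < 0.75 * n.distance]
    if len(good) < 4:
        return None
    src = np.float32([k1[m.queryIdx].pt for m in good])
    dst = np.float32([k2[m.trainIdx].pt for m in good])
    # Robust similarity transform: rotation, translation (and scale) with RANSAC
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
    return angle, (float(M[0, 2]), float(M[1, 2]))
```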

Place, publisher, year, edition, pages
IEEE conference proceedings, 2011
Series
IEEE International Conference on Multimedia and Expo, ISSN 1945-7871
Keywords
Mobile interaction, rotational symmetries, SIFT, 3D manipulation
National subject category
Interaction Technologies
Identifiers
urn:nbn:se:umu:diva-52816 (URN)
10.1109/ICME.2011.6012155 (DOI)
000304354700154
978-1-61284-349-0 (ISBN)
Conference
Multimedia and Expo (ICME), 2011 IEEE International Conference on, Barcelona, Spain, July 11-15, 2011
Available from: 2012-03-02 Created: 2012-03-02 Last updated: 2018-06-08 Bibliographically approved
3. 3D Active Human Motion Estimation for Biomedical Applications
2012 (English) In: World Congress on Medical Physics and Biomedical Engineering May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, p. 1014-1017. Conference paper, Published paper (Refereed)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to the above-mentioned issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient while practising daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.
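Because the camera is mounted on the body, the motion of interest is the camera's own motion between consecutive frames. As a hedged illustration only (the paper's estimation pipeline is not reproduced; SIFT appears among the keywords, the intrinsic matrix K is assumed known, and a single camera only recovers translation up to scale), relative camera rotation and translation direction can be obtained from matched features via the essential matrix:

```python
import cv2
import numpy as np

def camera_ego_motion(prev_gray, curr_gray, K):
    """Relative rotation R and unit translation t of a body-mounted camera (sketch)."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(prev_gray, None)
    k2, d2 = sift.detectAndCompute(curr_gray, None)
    good = [m for m, n in cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
            if m.distance < 0.75 * n.distance]
    p1 = np.float32([k1[m.queryIdx].pt for m in good])
    p2 = np.float32([k2[m.trainIdx].pt for m in good])
    # Essential matrix with RANSAC, then cheirality check to pick the right (R, t)
    E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
    return R, t  # motion of the camera, and hence of the body segment it is mounted on
```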

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. p. 4
Series
IFMBE Proceedings, ISSN 1680-0737 ; 39
Keywords
Active motion tracking, Human motion analysis, Movement disorder, SIFT
National subject category
Biomedical Laboratory Science/Technology
Identifiers
urn:nbn:se:umu:diva-55831 (URN)
10.1007/978-3-642-29305-4_266 (DOI)
978-3-642-29304-7 (print) (ISBN)
978-3-642-29305-4 (ISBN)
Conference
World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012
Available from: 2012-06-04 Created: 2012-06-04 Last updated: 2018-06-08 Bibliographically approved
4. Direct three-dimensional head pose estimation from Kinect-type sensors
2014 (English) In: Electronics Letters, ISSN 0013-5194, E-ISSN 1350-911X, Vol. 50, no. 4, p. 268-270. Article in journal, Letter (Refereed) Published
Abstract [en]

A direct method for recovering three-dimensional (3D) head motion parameters from a sequence of range images acquired by Kinect sensors is presented. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to directly estimate the 3D motion parameters without the need to impose other constraints. Since all calculations with the new constraint equation are based on the range images Z(x, y, t), the existing techniques and experience developed and accumulated on the topic of motion from optical flow can be applied directly, simply by treating the range images as normal intensity images I(x, y, t). In this work, it is demonstrated how to employ the new optical flow constraint equation to recover the 3D motion of a moving head from sequences of range images and, furthermore, how to use an old trick to handle the case when the optical flow is large. It is shown, in the end, that the performance of the proposed approach is comparable with that of some state-of-the-art approaches that use range data to recover 3D motion parameters.
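A hedged sketch of the kind of constraint the letter describes (the exact derivation and parameterization in the paper may differ) is the range-image analogue of the brightness-constancy equation, combined with a rigid-motion model:

```latex
% Range-image analogue of the optical-flow constraint: Z(x,y,t) is the range
% image, (u, v) the image-plane motion and \dot{Z} the velocity along the
% optical axis (a sketch; the paper's exact derivation may differ).
\[
  Z_x\,u + Z_y\,v + Z_t = \dot{Z}
\]
% For a rigidly moving head with translation \mathbf{t} = (t_x, t_y, t_z)^\top
% and small rotation \boldsymbol{\omega} = (\omega_x, \omega_y, \omega_z)^\top,
% a surface point \mathbf{P} = (X, Y, Z)^\top has velocity
\[
  \dot{\mathbf{P}} = \boldsymbol{\omega} \times \mathbf{P} + \mathbf{t},
\]
% so substituting the projections of \dot{\mathbf{P}} for (u, v, \dot{Z}) yields
% one linear equation per pixel in the six motion parameters, which can be
% solved by least squares over the head region.
```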

National subject category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86584 (URN)
10.1049/el.2013.2489 (DOI)
000331405200019
Available from: 2014-03-06 Created: 2014-03-02 Last updated: 2018-06-08 Bibliographically approved
5. Head operated electric wheelchair
2014 (English) In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014), IEEE, 2014, p. 53-56. Conference paper, Published paper (Refereed)
Abstract [en]

Currently, the most common way to control an electric wheelchair is to use a joystick. However, some individuals are unable to operate joystick-driven electric wheelchairs due to severe physical disabilities, such as quadriplegia. This paper proposes a novel head pose estimation method to assist such patients. Head motion parameters are employed to control and drive an electric wheelchair. We introduce a direct method for estimating user head motion, based on a sequence of range images captured by Kinect. In this work, we derive a new version of the optical flow constraint equation for range images. We show how the new equation can be used to estimate head motion directly. Experimental results reveal that the proposed system works with high accuracy in real time. We also show simulation results for navigating the electric wheelchair by recovering user head motion.
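How the recovered head motion is mapped to wheelchair commands is not specified in the abstract; the following is a purely hypothetical mapping (the dead zone, limits and the choice of yaw/pitch as control axes are illustrative assumptions, not the paper's design) showing how pose parameters could drive linear and angular velocity:

```python
def wheelchair_command(yaw_rad, pitch_rad, dead_zone=0.1, v_max=1.0, w_max=0.8):
    """Map estimated head pose to (linear, angular) wheelchair velocities (hypothetical)."""
    def shape(angle):
        # Ignore small, involuntary head movements; saturate large ones
        if abs(angle) < dead_zone:
            return 0.0
        s = min(1.0, (abs(angle) - dead_zone) / (0.5 - dead_zone))
        return s if angle > 0 else -s

    linear = v_max * shape(-pitch_rad)   # nod forward -> drive forward
    angular = w_max * shape(yaw_rad)     # turn head right -> turn right
    return linear, angular
```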

Place, publisher, year, edition, pages
IEEE, 2014
Series
IEEE Southwest Symposium on Image Analysis and Interpretation, ISSN 1550-5782
National subject category
Signal Processing; Computer Sciences
Identifiers
urn:nbn:se:umu:diva-86746 (URN)
000355255900014
978-1-4799-4053-0 (ISBN)
Conference
IEEE Southwest Symposium on Image Analysis and Interpretation SSIAI
Available from: 2014-03-06 Created: 2014-03-06 Last updated: 2018-06-08 Bibliographically approved
6. Direct hand pose estimation for immersive gestural interaction
2015 (English) In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, p. 91-99. Article in journal (Refereed) Published
Abstract [en]

This paper presents a novel approach for performing intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition. This problem can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to directly estimate 3D hand motion without the need to impose other constraints. Extensive experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation on two different setups: desktop computing and a mobile platform. This reveals the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience and interaction quality of 3D gestural interaction in comparison to 2D touchscreen interaction.
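To make the "direct" estimation step concrete, here is a hedged sketch of how a per-pixel linear constraint of the range-flow kind can be stacked into a least-squares problem for the six motion parameters (the projection model, gradients, weighting and iteration scheme of the paper are not reproduced; the pinhole intrinsics and the hand mask are assumed to be given):

```python
import numpy as np

def direct_rigid_motion(Z0, Z1, fx, fy, cx, cy, mask):
    """Least-squares 6-DoF motion (tx, ty, tz, wx, wy, wz) between two depth maps (sketch)."""
    Zy, Zx = np.gradient(Z0)                 # spatial depth gradients
    Zt = Z1 - Z0                             # temporal depth difference
    ys, xs = np.nonzero(mask)                # pixels on the hand region
    Z = Z0[ys, xs]
    X = (xs - cx) * Z / fx                   # back-projected 3D coordinates
    Y = (ys - cy) * Z / fy
    gx, gy, gt = Zx[ys, xs], Zy[ys, xs], Zt[ys, xs]

    n = Z.size
    zeros, ones = np.zeros(n), np.ones(n)
    # Rigid-motion velocity, linear in m = (tx, ty, tz, wx, wy, wz):
    #   Xdot = tx + wy*Z - wz*Y,  Ydot = ty + wz*X - wx*Z,  Zdot = tz + wx*Y - wy*X
    aX = np.stack([ones, zeros, zeros, zeros, Z, -Y], axis=1)
    aY = np.stack([zeros, ones, zeros, -Z, zeros, X], axis=1)
    aZ = np.stack([zeros, zeros, ones, Y, -X, zeros], axis=1)
    # Pinhole image-plane motion: u = fx*(Xdot*Z - X*Zdot)/Z^2, v likewise
    aU = (fx / Z)[:, None] * aX - (fx * X / Z ** 2)[:, None] * aZ
    aV = (fy / Z)[:, None] * aY - (fy * Y / Z ** 2)[:, None] * aZ
    # Per-pixel constraint  Zx*u + Zy*v - Zdot = -Zt  stacked and solved
    A = gx[:, None] * aU + gy[:, None] * aV - aZ
    b = -gt
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m  # small-motion estimate; warp and iterate for larger motion
```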

Keywords
Immersive gestural interaction, Dynamic gesture recognition, Hand pose estimation
National subject category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86748 (URN)
10.1016/j.patrec.2015.03.013 (DOI)
000362271100011
Available from: 2014-03-06 Created: 2014-03-06 Last updated: 2018-06-08 Bibliographically approved
7. Telelife: An immersive media experience for rehabilitation
2014 (English) In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper, Published paper (Refereed)
Abstract [en]

In recent years, the emergence of telerehabilitation systems for home-based therapy has altered healthcare systems. Telerehabilitation enables therapists to observe patients' status via the Internet, so a patient does not have to visit a rehabilitation facility for every session. Although telerehabilitation provides great opportunities, two major issues affect its effectiveness: the isolation of the patient at home, and the loss of the therapist's direct supervision. Since patients have no actual interaction with other people during the rehabilitation period, they become isolated and gradually lose their social skills. Moreover, without the direct supervision of therapists, rehabilitation exercises can be performed with poor compensation strategies that lead to a low-quality recovery. To resolve these issues, we propose telelife, a new concept for future rehabilitation systems. The idea is to use media technology to create a totally new immersive media experience for rehabilitation. In telerehabilitation, patients execute exercises locally and therapists monitor their status remotely; in telelife, by contrast, patients perform exercises remotely and therapists monitor locally. Thus, telelife not only enables rehabilitation at a distance, but also improves patients' social competences and provides the therapists' direct supervision. In this paper we introduce telelife to enhance telerehabilitation, and investigate technical challenges and possible methods to achieve it.

Place, publisher, year, edition, pages
IEEE, 2014
National subject category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-88093 (URN)
10.1109/APSIPA.2014.7041675 (DOI)
000392861900163
978-6-1636-1823-8 (ISBN)
Conference
2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9-12 December 2014, Chiang Mai, Thailand
Available from: 2014-04-23 Created: 2014-04-23 Last updated: 2018-06-08 Bibliographically approved

Open Access in DiVA

fulltext: FULLTEXT01.pdf (4331 kB, application/pdf)

Person records

Abedan Kondori, Farid
