Bring Your Body into Action: Body Gesture Detection, Tracking, and Analysis for Natural Interaction
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics. (Digital Media Lab)
2014 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Due to the large influx of computers into our daily lives, human-computer interaction has become crucially important. For a long time, the design of interaction methods has focused on what users need. However, a newer perspective extends this attitude to encompass how human desires, interests, and ambitions can be met and supported. This implies that the way we interact with computers should be revisited. Placing human values, rather than user needs, at the center is of the utmost importance for providing new interaction techniques. These values drive our decisions and actions, and are essential to what makes us human. This motivated us to introduce new interaction methods that support human values, particularly human well-being.

The aim of this thesis is to design new interaction methods that empower humans to have a healthy, intuitive, and pleasurable interaction with tomorrow's digital world. To achieve this aim, this research is concerned with developing theories and techniques for exploring interaction methods beyond the keyboard and mouse, utilizing the human body. This thesis therefore addresses a fundamental problem: human motion analysis.

The technical contributions of this thesis introduce computer vision-based, marker-less systems to estimate and analyze body motion. The main focus of this research is on head and hand motion analysis, since these are the body parts most frequently used for interacting with computers. This thesis gives an insight into the technical challenges and provides new perspectives and robust techniques for solving the problem.

Place, publisher, year, edition, pages
Umeå: Umeå universitet, 2014. 70 p.
Series
Digital Media Lab, ISSN 1652-6295 ; 19
Keyword [en]
Human Well-Being, Bodily Interaction, Natural Interaction, Human Motion Analysis, Active Motion Estimation, Direct Motion Estimation, Head Pose Estimation, Hand Pose Estimation.
National Category
Signal Processing
Research subject
Signal Processing; Computerized Image Analysis
Identifiers
URN: urn:nbn:se:umu:diva-88508
ISBN: 978-91-7601-067-9 (print)
OAI: oai:DiVA.org:umu-88508
DiVA: diva2:716122
Public defence
2014-06-04, Naturvetarhuset, N420, Umeå universitet, Umeå, 13:00 (English)
Available from: 2014-05-14. Created: 2014-05-08. Last updated: 2014-05-13. Bibliographically approved.
List of papers
1. Camera-based gesture tracking for 3D interaction behind mobile devices
2012 (English). In: International Journal of Pattern Recognition and Artificial Intelligence, ISSN 0218-0014, Vol. 26, no. 8, p. 1260008. Article in journal (Refereed). Published.
Abstract [en]

The number of mobile devices such as smartphones or tablet PCs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays that make interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate objects in the 3D space behind the device, in the camera's field of view. In this paper, our gestural interaction relies on particular patterns of local image orientation called rotational symmetries. The approach is based on finding the most suitable pattern from a large set of rotational symmetries of different orders, which ensures a reliable detector for fingertips and user gestures. Consequently, gesture detection and tracking can be used as an efficient tool for 3D manipulation in various virtual/augmented reality applications.
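The detector described above builds on rotational symmetries of the local-orientation image. As a minimal sketch of that idea (not the paper's actual filters or parameters), the double-angle representation of the image gradient can be correlated with complex-exponential basis patterns; a circularly symmetric blob then shows up as a second-order rotational symmetry:

```python
import numpy as np

def double_angle(image):
    """Double-angle local orientation image z = (gx + 1j*gy)**2."""
    gy, gx = np.gradient(image.astype(float))
    return (gx + 1j * gy) ** 2

def symmetry_response(z, cy, cx, order, radius=8):
    """Magnitude of the correlation between the orientation image and a
    rotational-symmetry basis exp(1j*order*phi) in a circular window."""
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    phi = np.arctan2(ys, xs)
    r = np.hypot(xs, ys)
    basis = np.where((r > 0) & (r <= radius), np.exp(1j * order * phi), 0)
    patch = z[cy - radius:cy + radius + 1, cx - radius:cx + radius + 1]
    return abs(np.sum(patch * np.conj(basis)))

# Synthetic check: a circularly symmetric blob is a second-order
# rotational symmetry in the double-angle representation.
n = 41
ys, xs = np.mgrid[:n, :n] - n // 2
blob = np.exp(-(xs ** 2 + ys ** 2) / 50.0)
z = double_angle(blob)
scores = {k: symmetry_response(z, n // 2, n // 2, k) for k in (0, 1, 2)}
# the order-2 response dominates at the blob centre
```

The paper selects the most suitable symmetry order from a large filter set to detect fingertips; this sketch only verifies the order-2 (circular) case on a synthetic pattern.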

Keyword
3D mobile interaction, rotational symmetries, gesture detection, tracking, mobile applications
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:umu:diva-67979 (URN); 10.1142/S0218001412600087 (DOI); 000315523100006 ()
Available from: 2013-04-09. Created: 2013-04-09. Last updated: 2017-12-06. Bibliographically approved.
2. Real 3D Interaction Behind Mobile Phones for Augmented Environments
2011 (English). In: 2011 IEEE International Conference on Multimedia and Expo (ICME), IEEE conference proceedings, 2011, p. 1-6. Conference paper, Published paper (Refereed).
Abstract [en]

The number of mobile devices such as mobile phones or PDAs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays, which make interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate objects in the 3D space behind the device, in the camera's field of view. This paper suggests the use of particular patterns of local image orientation, called rotational symmetries, to detect and localize human gestures. The relative rotation and translation of the gesture between consecutive frames are estimated by extracting stable features. Consequently, this information can be used to facilitate the 3D manipulation of virtual objects in various mobile applications.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2011
Series
IEEE International Conference on Multimedia and Expo, ISSN 1945-7871
Keyword
Mobile interaction, rotational symmetries, SIFT, 3D manipulation
National Category
Interaction Technologies
Identifiers
urn:nbn:se:umu:diva-52816 (URN); 10.1109/ICME.2011.6012155 (DOI); 000304354700154 (); 978-1-61284-349-0 (ISBN)
Conference
Multimedia and Expo (ICME), 2011 IEEE International Conference on, Barcelona, Spain, July 11-15, 2011
Available from: 2012-03-02. Created: 2012-03-02. Last updated: 2017-01-16. Bibliographically approved.
3. 3D Active Human Motion Estimation for Biomedical Applications
2012 (English). In: World Congress on Medical Physics and Biomedical Engineering, May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, p. 1014-1017. Conference paper, Published paper (Refereed).
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, they cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to the above-mentioned issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by patients while practicing daily-life activities. We refer to this system, which is not confined to the lab environment, as active motion capture. Real-time experiments in our lab revealed the robustness and accuracy of the system.

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. 4 p.
Series
IFMBE Proceedings, ISSN 1680-0737 ; 39
Keyword
Active motion tracking, Human motion analysis, Movement disorder, SIFT
National Category
Biomedical Laboratory Science/Technology
Identifiers
urn:nbn:se:umu:diva-55831 (URN); 10.1007/978-3-642-29305-4_266 (DOI); 978-3-642-29304-7 (print) (ISBN); 978-3-642-29305-4 (ISBN)
Conference
World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012
Available from: 2012-06-04. Created: 2012-06-04. Last updated: 2014-05-21. Bibliographically approved.
4. Direct three-dimensional head pose estimation from Kinect-type sensors
2014 (English). In: Electronics Letters, ISSN 0013-5194, E-ISSN 1350-911X, Vol. 50, no. 4, p. 268-270. Article in journal, Letter (Refereed). Published.
Abstract [en]

A direct method for recovering three-dimensional (3D) head motion parameters from a sequence of range images acquired by Kinect sensors is presented. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to estimate 3D motion parameters directly, without the need to impose additional constraints. Since all calculations with the new constraint equation are based on the range images Z(x, y, t), the techniques and experience accumulated on the topic of motion from optical flow can be applied directly, simply by treating the range images as normal intensity images I(x, y, t). In this work, it is demonstrated how to employ the new optical flow constraint equation to recover the 3D motion of a moving head from sequences of range images and, furthermore, how to use an old trick to handle the case when the optical flow is large. Finally, it is shown that the performance of the proposed approach is comparable with that of state-of-the-art approaches that use range data to recover 3D motion parameters.
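The letter's constraint equation handles full 3D rigid motion (rotation and translation). As a hedged illustration of the direct, gradient-based idea only, the sketch below recovers a pure 3D translation of a range surface from two range images, using the translational special case Z_x*tx + Z_y*ty - tz + Z_t = 0; this is a simplification for illustration, not the paper's actual derivation:

```python
import numpy as np

def estimate_translation(Z1, Z2):
    """Least-squares 3D translation of a rigidly translating range
    surface, from the per-pixel constraint Zx*tx + Zy*ty - tz = -Zt."""
    Zy, Zx = np.gradient(Z1)   # spatial derivatives of the range map
    Zt = Z2 - Z1               # temporal derivative
    s = np.s_[1:-1, 1:-1]      # interior: central differences are accurate
    A = np.stack([Zx[s].ravel(), Zy[s].ravel(),
                  -np.ones(Zx[s].size)], axis=1)
    b = -Zt[s].ravel()
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params              # (tx, ty, tz)

# Synthetic demo: a paraboloid range map shifted by a known translation.
ys, xs = np.mgrid[:64, :64].astype(float)
Z1 = 0.01 * ((xs - 32) ** 2 + (ys - 32) ** 2)
tx, ty, tz = 0.6, -0.4, 0.5
Z2 = 0.01 * ((xs - tx - 32) ** 2 + (ys - ty - 32) ** 2) + tz
est = estimate_translation(Z1, Z2)
```

As in intensity-based optical flow, the linearization only holds for small motions; the "old trick" mentioned in the abstract (coarse-to-fine processing is the usual choice) is what handles large flows, and is not reproduced here.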

National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86584 (URN); 10.1049/el.2013.2489 (DOI); 000331405200019 ()
Available from: 2014-03-06. Created: 2014-03-02. Last updated: 2017-12-05. Bibliographically approved.
5. Head operated electric wheelchair
2014 (English). In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014), IEEE, 2014, p. 53-56. Conference paper, Published paper (Refereed).
Abstract [en]

Currently, the most common way to control an electric wheelchair is to use a joystick. However, some individuals, such as patients with quadriplegia, are unable to operate joystick-driven electric wheelchairs due to severe physical disabilities. This paper proposes a novel head pose estimation method to assist such patients: head motion parameters are employed to control and drive an electric wheelchair. We introduce a direct method for estimating user head motion, based on a sequence of range images captured by Kinect. We derive a new version of the optical flow constraint equation for range images and show how it can be used to estimate head motion directly. Experimental results reveal that the proposed system works with high accuracy in real time. We also show simulation results for navigating the electric wheelchair by recovering user head motion.
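As an illustration of how recovered head motion might drive the chair, the following hypothetical mapping converts head pitch/yaw into linear and angular velocity commands. The dead zone, speed values, and the pitch/yaw-to-command assignment are illustrative assumptions, not values taken from the paper:

```python
# Hypothetical glue code from estimated head pose to wheelchair commands.
DEAD_ZONE_DEG = 10.0  # ignore small, involuntary head movements

def head_to_command(pitch_deg, yaw_deg):
    """Map head pitch/yaw (degrees) to (linear, angular) velocity."""
    linear = 0.0
    angular = 0.0
    if abs(pitch_deg) > DEAD_ZONE_DEG:
        # nod forward -> drive forward; tilt back -> reverse, more slowly
        linear = 0.5 if pitch_deg > 0 else -0.3
    if abs(yaw_deg) > DEAD_ZONE_DEG:
        # turn the head -> steer in that direction
        angular = 0.8 if yaw_deg > 0 else -0.8
    return linear, angular

print(head_to_command(15.0, 2.0))   # forward, no turn
```

A dead zone of this kind is a common safety choice in assistive interfaces, since the pose estimator's small jitter and involuntary motions should not move the chair.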

Place, publisher, year, edition, pages
IEEE, 2014
Series
IEEE Southwest Symposium on Image Analysis and Interpretation, ISSN 1550-5782
National Category
Signal Processing; Computer Science
Identifiers
urn:nbn:se:umu:diva-86746 (URN); 000355255900014 (); 978-1-4799-4053-0 (ISBN)
Conference
IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI), 2014
Available from: 2014-03-06. Created: 2014-03-06. Last updated: 2016-02-23. Bibliographically approved.
6. Direct hand pose estimation for immersive gestural interaction
2015 (English). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, p. 91-99. Article in journal (Refereed). Published.
Abstract [en]

This paper presents a novel approach for intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition, a problem that can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper, we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to estimate 3D hand motion directly, without the need to impose additional constraints. Extensive experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system's performance in 3D object manipulation on two different setups: a desktop computer and a mobile platform. This reveals the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience, and interaction quality of 3D gestural interaction in comparison to 2D touch-screen interaction.

Keyword
Immersive gestural interaction, Dynamic gesture recognition, Hand pose estimation
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86748 (URN); 10.1016/j.patrec.2015.03.013 (DOI); 000362271100011 ()
Available from: 2014-03-06. Created: 2014-03-06. Last updated: 2017-12-05. Bibliographically approved.
7. Telelife: An Immersive Media Experience for Rehabilitation
2014 (English). Conference paper, Published paper (Refereed).
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-88093 (URN)
Conference
Asia-Pacific Signal and Information Processing Association Annual Summit and Conference
Available from: 2014-04-23 Created: 2014-04-23 Last updated: 2014-09-20

Open Access in DiVA

fulltext (4331 kB)

Search in DiVA

By author/editor
Abedan Kondori, Farid
By organisation
Department of Applied Physics and Electronics
Signal Processing
