Abedan Kondori, Farid
Publications (10 of 22)
Abedan Kondori, F., Yousefi, S., Kouma, J.-P., Liu, L. & Li, H. (2015). Direct hand pose estimation for immersive gestural interaction. Pattern Recognition Letters, 66, 91-99
Direct hand pose estimation for immersive gestural interaction
2015 (English). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, p. 91-99. Article in journal (Refereed). Published
Abstract [en]

This paper presents a novel approach for performing intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition. This problem can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to directly estimate 3D hand motion without imposing any other constraints. Extensive experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation on two different setups: desktop computing and a mobile platform. This reveals the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience and interaction quality in 3D gestural interaction in comparison to 2D touchscreen interaction.
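As a rough illustration of what such a direct method can look like (a minimal sketch under assumptions, not the authors' implementation), the range-image analogue of the optical flow constraint can be stacked over all pixels and solved for a global 3D translation by linear least squares. The pure-translation simplification and all names below are assumptions:

    import numpy as np

    def estimate_translation(Z_prev, Z_next):
        # Spatial depth gradients; np.gradient returns d/d(rows), d/d(cols).
        Zy, Zx = np.gradient(Z_prev)
        # Temporal depth derivative between consecutive depth frames.
        Zt = Z_next - Z_prev
        # Ignore depth holes (Kinect reports 0 where no depth is measured).
        valid = (Z_prev > 0) & (Z_next > 0)
        # Range flow constraint per pixel: Zx*U + Zy*V - W + Zt = 0,
        # stacked into A @ [U, V, W] = b and solved by least squares.
        A = np.stack([Zx[valid], Zy[valid], -np.ones(valid.sum())], axis=1)
        b = -Zt[valid]
        (U, V, W), *_ = np.linalg.lstsq(A, b, rcond=None)
        return U, V, W

In a full system the estimation would be restricted to pixels on the segmented hand and extended to rigid motion (rotation plus translation); the sketch keeps only the translational core to show the "direct" character: depth gradients only, no feature extraction or correspondence search.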

Keywords
Immersive gestural interaction, Dynamic gesture recognition, Hand pose estimation
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86748 (URN); 10.1016/j.patrec.2015.03.013 (DOI); 000362271100011
Available from: 2014-03-06. Created: 2014-03-06. Last updated: 2018-06-08. Bibliographically approved.
Abedan Kondori, F., Yousefi, S., Ostovar, A., Liu, L. & Li, H. (2014). A Direct Method for 3D Hand Pose Recovery. In: 22nd International Conference on Pattern Recognition. Paper presented at the 22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden (pp. 345-350).
A Direct Method for 3D Hand Pose Recovery
2014 (English). In: 22nd International Conference on Pattern Recognition, 2014, p. 345-350. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents a novel approach for performing intuitive 3D gesture-based interaction using depth data acquired by Kinect. Unlike current depth-based systems that focus only on the classical gesture recognition problem, we also consider 3D gesture pose estimation for creating immersive gestural interaction. In this paper, we formulate the gesture-based interaction system as a combination of two separate problems: gesture recognition and gesture pose estimation. We focus on the second problem and propose a direct method for recovering hand motion parameters. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to directly estimate 3D hand motion without imposing any other constraints. Our experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation. This application is intended to explore the system's capabilities in real-time biomedical applications. Finally, a system usability test is conducted to evaluate the learnability, user experience and interaction quality in 3D interaction in comparison to 2D touch-screen interaction.

Series
International Conference on Pattern Recognition, ISSN 1051-4651
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:umu:diva-108475 (URN); 10.1109/ICPR.2014.68 (DOI); 000359818000057; 978-1-4799-5208-3 (ISBN)
Conference
22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden
Available from: 2015-09-14. Created: 2015-09-11. Last updated: 2019-11-11. Bibliographically approved.
Abedan Kondori, F. (2014). Bring Your Body into Action: Body Gesture Detection, Tracking, and Analysis for Natural Interaction. (Doctoral dissertation). Umeå: Umeå Universitet
Bring Your Body into Action: Body Gesture Detection, Tracking, and Analysis for Natural Interaction
2014 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Due to the large influx of computers into our daily lives, human-computer interaction has become crucially important. For a long time, focusing on what users need has been critical to designing interaction methods. However, a newer perspective extends this attitude to encompass how human desires, interests, and ambitions can be met and supported. This implies that the way we interact with computers should be revisited. Centralizing human values rather than user needs is of the utmost importance for providing new interaction techniques. These values drive our decisions and actions, and are essential to what makes us human. This motivated us to introduce new interaction methods that support human values, particularly human well-being.

The aim of this thesis is to design new interaction methods that empower humans to have a healthy, intuitive, and pleasurable interaction with tomorrow's digital world. To achieve this aim, this research is concerned with developing theories and techniques for exploring interaction methods beyond the keyboard and mouse, utilizing the human body. The thesis therefore addresses a fundamental problem: human motion analysis.

The technical contributions of this thesis introduce computer vision-based, marker-less systems to estimate and analyze body motion. The main focus of this research is on head and hand motion analysis, since these are the body parts most frequently used for interacting with computers. This thesis gives an insight into the technical challenges and provides new perspectives and robust techniques for solving the problem.

Place, publisher, year, edition, pages
Umeå: Umeå Universitet, 2014. p. 70
Series
Digital Media Lab, ISSN 1652-6295 ; 19
Keywords
Human Well-Being, Bodily Interaction, Natural Interaction, Human Motion Analysis, Active Motion Estimation, Direct Motion Estimation, Head Pose Estimation, Hand Pose Estimation.
National Category
Signal Processing
Research subject
Signal Processing; Computerized Image Analysis
Identifiers
urn:nbn:se:umu:diva-88508 (URN); 978-91-7601-067-9 (ISBN)
Public defence
2014-06-04, Naturvetarhuset, N420, Umeå universitet, Umeå, 13:00 (English)
Available from: 2014-05-14. Created: 2014-05-08. Last updated: 2018-06-07. Bibliographically approved.
Abedan Kondori, F., Yousefi, S. & Li, H. (2014). Direct three-dimensional head pose estimation from Kinect-type sensors [Letter to the editor]. Electronics Letters, 50(4), 268-270
Direct three-dimensional head pose estimation from Kinect-type sensors
2014 (English). In: Electronics Letters, ISSN 0013-5194, E-ISSN 1350-911X, Vol. 50, no 4, p. 268-270. Article in journal, Letter (Refereed). Published
Abstract [en]

A direct method for recovering three-dimensional (3D) head motion parameters from a sequence of range images acquired by Kinect sensors is presented. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to directly estimate 3D motion parameters without imposing any other constraints. Since all calculations with the new constraint equation are based on the range images, Z(x, y, t), the existing techniques and experience developed and accumulated on the topic of motion from optical flow can be applied directly, simply by treating the range images as normal intensity images I(x, y, t). In this work, it is demonstrated how to employ the new optical flow constraint equation to recover the 3D motion of a moving head from sequences of range images, and furthermore, how to use an old trick to handle the case when the optical flow is large. It is shown, in the end, that the performance of the proposed approach is comparable with that of some state-of-the-art approaches that use range data to recover 3D motion parameters.
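For readers who want the constraint written out: the description above matches the classical range flow constraint, so a plausible form (an assumption; the letter's own derivation may differ in detail) is

    \frac{\partial Z}{\partial x}\,U + \frac{\partial Z}{\partial y}\,V - W + \frac{\partial Z}{\partial t} = 0,

which mirrors the brightness constancy equation I_x u + I_y v + I_t = 0 with the depth map Z(x, y, t) in place of the intensity image I(x, y, t) and (U, V, W) the 3D velocity. The extra -W term appears because, unlike intensity, a surface moving in depth changes its own range values.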

National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86584 (URN); 10.1049/el.2013.2489 (DOI); 000331405200019
Available from: 2014-03-06. Created: 2014-03-02. Last updated: 2018-06-08. Bibliographically approved.
Abedan Kondori, F., Yousefi, S., Liu, L. & Li, H. (2014). Head operated electric wheelchair. In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014). Paper presented at the IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI) (pp. 53-56). IEEE
Head operated electric wheelchair
2014 (English). In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014), IEEE, 2014, p. 53-56. Conference paper, Published paper (Refereed)
Abstract [en]

Currently, the most common way to control an electric wheelchair is with a joystick. However, some individuals, such as quadriplegia patients, are unable to operate joystick-driven electric wheelchairs due to severe physical disabilities. This paper proposes a novel head pose estimation method to assist such patients. Head motion parameters are employed to control and drive an electric wheelchair. We introduce a direct method for estimating user head motion, based on a sequence of range images captured by Kinect. In this work, we derive a new version of the optical flow constraint equation for range images and show how it can be used to estimate head motion directly. Experimental results reveal that the proposed system works with high accuracy in real time. We also show simulation results for navigating the electric wheelchair by recovering the user's head motion.
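The abstract does not specify how estimated head motion is mapped to wheelchair commands, so the following is purely hypothetical glue code showing one plausible mapping: pitch (nodding) rate to forward speed and yaw (turning) rate to steering, with a dead zone against small involuntary motion. Every name, threshold, and gain here is invented for illustration:

    def head_to_wheelchair(pitch_rate, yaw_rate,
                           dead_zone=0.05, v_max=1.0, w_max=0.5):
        # Hypothetical mapping; rates are assumed normalized to [-1, 1].
        def shape(rate, limit):
            if abs(rate) < dead_zone:  # suppress small involuntary motion
                return 0.0
            return max(-limit, min(limit, rate * limit))
        linear = shape(pitch_rate, v_max)   # forward speed, m/s
        angular = shape(yaw_rate, w_max)    # steering rate, rad/s
        return linear, angular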

Place, publisher, year, edition, pages
IEEE, 2014
Series
IEEE Southwest Symposium on Image Analysis and Interpretation, ISSN 1550-5782
National Category
Signal Processing; Computer Sciences
Identifiers
urn:nbn:se:umu:diva-86746 (URN); 000355255900014; 978-1-4799-4053-0 (ISBN)
Conference
IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI)
Available from: 2014-03-06. Created: 2014-03-06. Last updated: 2018-06-08. Bibliographically approved.
Yousefi, S., Kondori, F. A. & Li, H. (2014). Interactive 3D visualization on a 4K wall-sized display. In: Proceedings of 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014). Paper presented at 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9–12 December 2014, Chiang Mai, Thailand. IEEE
Interactive 3D visualization on a 4K wall-sized display
2014 (English). In: Proceedings of 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper, Published paper (Refereed)
Abstract [en]

This paper introduces a novel vision-based approach for realistic interaction between the user and the display's content. A highly accurate motion capture system is proposed to measure and track the user's head motion in 3D space. Video frames captured by a low-cost head-mounted camera are processed to retrieve the 3D motion parameters, and the retrieved information enables real-time 3D interaction. This technology turns any 2D screen into an interactive 3D display, enabling users to control and manipulate the content as through a digital window. The proposed system is tested and verified on a wall-sized 4K screen.

Place, publisher, year, edition, pages
IEEE, 2014
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-101445 (URN); 10.1109/APSIPA.2014.7041653 (DOI); 000392861900141; 978-616-361-823-8 (ISBN)
Conference
2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9–12 December 2014, Chiang Mai, Thailand
Available from: 2015-03-30. Created: 2015-03-30. Last updated: 2018-06-07. Bibliographically approved.
Kondori, F. A., Liu, L. & Li, H. (2014). Telelife: An immersive media experience for rehabilitation. In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014). Paper presented at 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9-12 December 2014, Chiang Mai, Thailand. IEEE
Telelife: An immersive media experience for rehabilitation
2014 (English). In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper, Published paper (Refereed)
Abstract [en]

In recent years, the emergence of telerehabilitation systems for home-based therapy has altered healthcare systems. Telerehabilitation enables therapists to observe patients' status via the Internet, so a patient does not have to visit a rehabilitation facility for every session. Although telerehabilitation provides great opportunities, two major issues affect its effectiveness: the confinement of the patient at home, and the loss of the therapist's direct supervision. Since patients have no actual interaction with other people during the rehabilitation period, they can become isolated and gradually lose their social skills. Moreover, without the direct supervision of therapists, rehabilitation exercises can be performed with poor compensation strategies that lead to a poor-quality recovery. To resolve these issues, we propose telelife, a new concept for future rehabilitation systems. The idea is to use media technology to create an entirely new immersive media experience for rehabilitation. In telerehabilitation, patients execute exercises locally and therapists monitor their status remotely; in telelife, by contrast, patients perform exercises remotely and therapists monitor locally. Thus, telelife not only enables rehabilitation at a distance, but also improves patients' social competences and provides the direct supervision of therapists. In this paper we introduce telelife to enhance telerehabilitation, and investigate the technical challenges and possible methods to achieve it.

Place, publisher, year, edition, pages
IEEE, 2014
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-88093 (URN); 10.1109/APSIPA.2014.7041675 (DOI); 000392861900163; 978-616-361-823-8 (ISBN)
Conference
2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9-12 December 2014, Chiang Mai, Thailand
Available from: 2014-04-23. Created: 2014-04-23. Last updated: 2018-06-08. Bibliographically approved.
Abedan Kondori, F., Yousefi, S. & Liu, L. (2013). Active human gesture capture for diagnosing and treating movement disorders. Paper presented at the Swedish Symposium on Image Analysis (SSBA 2013), Gothenburg, Sweden.
Active human gesture capture for diagnosing and treating movement disorders
2013 (English). Conference paper, Published paper (Other academic)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to these issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient in daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.

National Category
Other Medical Engineering
Identifiers
urn:nbn:se:umu:diva-83352 (URN)
Conference
The Swedish Symposium on Image Analysis (SSBA 2013), Gothenburg, Sweden
Available from: 2013-11-22. Created: 2013-11-22. Last updated: 2018-06-08. Bibliographically approved.
Yousefi, S., Kondori, F. A. & Li, H. (2013). Experiencing real 3D gestural interaction with mobile devices. Pattern Recognition Letters, 34(8), 912-921
Experiencing real 3D gestural interaction with mobile devices
2013 (English). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 34, no 8, p. 912-921. Article in journal (Refereed). Published
Abstract [en]

The number of mobile devices such as smartphones and tablet PCs has increased dramatically in recent years. New mobile devices are equipped with integrated cameras and large displays, which make interaction with the device more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate in the 3D space behind the device, in the camera's field of view. In this paper, our gestural interaction relies on particular patterns in the local orientation of the image called rotational symmetries. The approach is based on finding the most suitable pattern from a large set of rotational symmetries of different orders, which ensures a reliable detector for hand gestures. Consequently, gesture detection and tracking can serve as an efficient tool for 3D manipulation in various applications in computer vision and augmented reality. The final output is rendered into color anaglyphs for 3D visualization. Depending on the coding technology, different low-cost 3D glasses can be used by viewers.
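A minimal sketch of rotational-symmetry detection in the standard double-angle orientation framework, the general technique the abstract names; the paper's exact filters, orders, and post-processing are assumptions not reproduced here:

    import numpy as np
    from scipy.signal import fftconvolve

    def symmetry_response(img, order, radius=8):
        # Double-angle local orientation: z = (fx + i*fy)^2, so orientations
        # that differ by 180 degrees map to the same complex angle.
        fy, fx = np.gradient(img.astype(float))
        z = (fx + 1j * fy) ** 2
        # Symmetry basis of the requested order, weighted by a Gaussian window.
        y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        phi = np.arctan2(y, x)
        window = np.exp(-(x ** 2 + y ** 2) / (2.0 * (radius / 2.0) ** 2))
        basis = window * np.exp(1j * order * phi)
        # Convolve with the conjugated basis; the response magnitude peaks
        # where the local orientation field matches an order-n symmetry.
        return np.abs(fftconvolve(z, np.conj(basis), mode='same'))

Scanning over a range of orders and keeping the strongest response would implement the abstract's search for "the most suitable pattern from a large set of rotational symmetries of different orders".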

Keywords
3D mobile interaction, Rotational symmetries, Gesture detection, SIFT, Gesture tracking, stereoscopic visualization
National Category
Interaction Technologies
Identifiers
urn:nbn:se:umu:diva-71586 (URN); 10.1016/j.patrec.2013.02.004 (DOI); 000318129200010
Available from: 2013-06-05. Created: 2013-06-04. Last updated: 2018-06-08. Bibliographically approved.
Abedan Kondori, F. & Liu, L. (2012). 3D Active Human Motion Estimation for Biomedical Applications. In: Mian Long (Ed.), World Congress on Medical Physics and Biomedical Engineering, May 26-31, 2012, Beijing, China. Paper presented at the World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012 (pp. 1014-1017). Springer Berlin/Heidelberg
3D Active Human Motion Estimation for Biomedical Applications
2012 (English). In: World Congress on Medical Physics and Biomedical Engineering, May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, p. 1014-1017. Conference paper, Published paper (Refereed)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to these issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient while practicing daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.
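The keywords name SIFT, so a body-mounted ("active") camera pipeline could plausibly look like the following two-frame ego-motion sketch: SIFT matching followed by essential-matrix estimation. This is an assumption about the pipeline, not the paper's published method; all names are illustrative:

    import cv2
    import numpy as np

    def ego_motion(gray_prev, gray_next, K):
        # SIFT keypoints and descriptors in two consecutive frames.
        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(gray_prev, None)
        kp2, des2 = sift.detectAndCompute(gray_next, None)
        # Nearest-neighbour matching with Lowe's ratio test.
        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        p1 = np.float32([kp1[m.queryIdx].pt for m in good])
        p2 = np.float32([kp2[m.trainIdx].pt for m in good])
        # Essential matrix with RANSAC, then the relative camera pose.
        E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC)
        _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=mask)
        return R, t  # rotation and unit-length translation direction

Because the camera rides on the body, the recovered camera motion is the body-segment motion itself, which is what makes the setup independent of any lab-installed tracking volume.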

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. p. 4
Series
IFMBE Proceedings, ISSN 1680-0737 ; 39
Keywords
Active motion tracking, Human motion analysis, Movement disorder, SIFT
National Category
Biomedical Laboratory Science/Technology
Identifiers
urn:nbn:se:umu:diva-55831 (URN); 10.1007/978-3-642-29305-4_266 (DOI); 978-3-642-29304-7 (print) (ISBN); 978-3-642-29305-4 (ISBN)
Conference
World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012
Available from: 2012-06-04. Created: 2012-06-04. Last updated: 2018-06-08. Bibliographically approved.