Li, Haibo
Publications (10 of 126)
Li, B., Li, H. & Söderström, U. (2016). Distinctive curve features. Electronics Letters, 52(3), 197-198.
Distinctive curve features
2016 (English) In: Electronics Letters, ISSN 0013-5194, E-ISSN 1350-911X, Vol. 52, no 3, 197-198 p. Article in journal (Other academic). Published
Abstract [en]

Curves and lines are geometric, abstract features of an image. Compared with interest points, which are more limited, curves and lines convey much more information about image structure. However, research on curve and line detection is fragmented, and the concept of scale space has not yet been well integrated into it. The keypoint (e.g. SIFT, SURF, ORB) is a successful concept for representing features (e.g. blobs, corners) in scale space. Inspired by the keypoint concept, a method is proposed that extracts distinctive curves (DICU) in scale space, with lines treated as a special form of curve feature. A curve feature can be represented by three keypoints (two end points and one middle point). A good way to test the quality of detected curves is to analyse their repeatability under various image transformations; DICU is evaluated on the standard Oxford benchmark, with the overlap error of a curve computed by averaging the overlap errors of its three keypoints. Experimental results show that DICU achieves good repeatability compared with other state-of-the-art methods. To match curve features, a relatively simple approach is to combine the local descriptors of the three keypoints on each curve.
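
A minimal sketch of the averaged overlap-error idea described above (illustrative only, not the paper's code; the (x, y, scale) keypoint representation and the disc-based overlap approximation are assumptions standing in for the Oxford benchmark's region-overlap measure):

```python
# Illustrative sketch, not code from the paper. A DICU curve is summarized by
# three keypoints (two end points, one middle point); the repeatability score
# averages the per-keypoint overlap error.
import numpy as np

def keypoint_overlap_error(kp_a, kp_b):
    """1 - IoU of two discs centred at the keypoints with radius = scale."""
    (xa, ya, sa), (xb, yb, sb) = kp_a, kp_b
    d = np.hypot(xa - xb, ya - yb)
    if d >= sa + sb:                       # discs do not intersect
        return 1.0
    r_min, r_max = min(sa, sb), max(sa, sb)
    if d <= r_max - r_min:                 # small disc inside the large one
        inter = np.pi * r_min ** 2
    else:                                  # partial overlap (rough estimate)
        inter = 0.5 * np.pi * r_min ** 2
    union = np.pi * (sa ** 2 + sb ** 2) - inter
    return 1.0 - inter / union

def curve_overlap_error(curve_a, curve_b):
    """Average the overlap error over the three keypoints of each curve."""
    return float(np.mean([keypoint_overlap_error(a, b)
                          for a, b in zip(curve_a, curve_b)]))

# Curves given as (end point, middle point, end point):
c1 = [(10, 10, 4.0), (20, 15, 4.0), (30, 10, 4.0)]
c2 = [(11, 10, 4.2), (21, 16, 4.1), (31, 11, 4.0)]
print(curve_overlap_error(c1, c2))
```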

Keyword
curve detection, line detection, feature matching
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Research subject
Signal Processing
Identifiers
urn:nbn:se:umu:diva-111184 (URN), 10.1049/el.2015.3495 (DOI), 000369674000014 (ISI)
Available from: 2015-11-06. Created: 2015-11-06. Last updated: 2017-12-01. Bibliographically approved.
Khan, M. S., Li, H. & ur Réhman, S. (2016). Telepresence Mechatronic Robot (TEBoT): Towards the design and control of socially interactive bio-inspired system. Journal of Intelligent & Fuzzy Systems, 31(5), 2597-2610.
Telepresence Mechatronic Robot (TEBoT): Towards the design and control of socially interactive bio-inspired system
2016 (English) In: Journal of Intelligent & Fuzzy Systems, ISSN 1064-1246, E-ISSN 1875-8967, Vol. 31, no 5, 2597-2610 p. Article in journal (Refereed). Published
Abstract [en]

Socially interactive systems are embodied agents that engage in social interactions with humans. From a design perspective, these systems are built around a biologically inspired (bio-inspired) design that can mimic and simulate human-like communication cues and gestures. The design of a bio-inspired system usually consists of (i) studying biological characteristics, (ii) designing a robot that resembles its biological counterpart, and (iii) motion planning that mimics the biological counterpart. In this article, we present the design, development, control strategy and verification of our socially interactive bio-inspired robot, the Telepresence Mechatronic Robot (TEBoT). The key contribution of our work is the embodiment of real human neck movements by (i) designing a mechatronic platform based on the dynamics of a real human neck and (ii) capturing real head movements through our novel single-camera vision algorithm. Our socially interactive bio-inspired system rests on an intuitive integration strategy that combines a computer-vision-based geometric head pose estimation algorithm, a model-based design (MBD) approach and real-time motion planning techniques. We have conducted extensive testing to demonstrate the effectiveness and robustness of the proposed system.
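
As a rough illustration of the embodiment idea, the sketch below maps an estimated head pose onto normalized neck-servo commands; the ranges, the two-servo interface and the [0, 1] command convention are hypothetical, not taken from the TEBoT design:

```python
# Illustrative sketch, not the TEBoT implementation. An estimated head pose
# (yaw, pitch in degrees) from the vision front end is clamped to the neck
# mechanism's (assumed) range and normalized to [0, 1] servo commands.

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def head_pose_to_servos(yaw_deg, pitch_deg,
                        yaw_range=(-60.0, 60.0), pitch_range=(-30.0, 30.0)):
    yaw = clamp(yaw_deg, *yaw_range)
    pitch = clamp(pitch_deg, *pitch_range)
    unit = lambda v, rng: (v - rng[0]) / (rng[1] - rng[0])
    return unit(yaw, yaw_range), unit(pitch, pitch_range)

print(head_pose_to_servos(15.0, -10.0))  # -> (0.625, 0.333...)
```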

Keyword
Socially interactive robot, biologically inspired robot, head pose estimation, vision based robot control, model based design, embodied telepresence system
National Category
Robotics; Computer Vision and Robotics (Autonomous Systems); Interaction Technologies
Identifiers
urn:nbn:se:umu:diva-108552 (URN), 10.3233/JIFS-169100 (DOI), 000386532000015 (ISI)
Available from: 2015-09-14. Created: 2015-09-14. Last updated: 2018-01-11. Bibliographically approved.
Halawani, A., ur Réhman, S. & Li, H. (2015). Active Vision for Tremor Disease Monitoring. Paper presented at 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015). Procedia Manufacturing, 3, 2042-2048.
Active Vision for Tremor Disease Monitoring
2015 (English) In: Procedia Manufacturing, ISSN 1873-6580, E-ISSN 2212-0173, Vol. 3, 2042-2048 p. Article in journal (Refereed). Published
Abstract [en]

The aim of this work is to introduce a prototype for monitoring tremor diseases using computer vision techniques. While vision has been used for this purpose before, the system we introduce differs intrinsically from traditional systems: the camera is placed on the user's body rather than in front of it, reversing the whole process of motion estimation. We call this active motion tracking. Active vision is simpler to set up and achieves more accurate results than the traditional arrangements, which we refer to here as "passive". One main advantage of active tracking is its ability to detect even tiny motions with this simple setup, which makes it very suitable for monitoring tremor disorders.
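
A minimal sketch of the active-tracking idea using the SIFT matching and motion estimation named in the keywords (OpenCV-based; the similarity-transform model, ratio-test threshold and RANSAC choice are assumptions, not the authors' implementation). The tremor trace is then the sequence of such per-frame displacements over time:

```python
# Illustrative sketch, not the authors' code. With the camera worn on the
# body, apparent motion of the scene encodes the tremor: SIFT matches between
# consecutive frames give a similarity transform whose translation
# approximates the per-frame displacement.
import cv2
import numpy as np

def frame_motion(prev_gray, curr_gray):
    """Return the (dx, dy) pixel displacement between two grayscale frames."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_gray, None)
    kp2, des2 = sift.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < 4:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in good])
    dst = np.float32([kp2[m.trainIdx].pt for m in good])
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return (float(M[0, 2]), float(M[1, 2])) if M is not None else None
```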

Keyword
Active vision, Tremors, SIFT, Motion estimation, Motion tracking
National Category
Computer Vision and Robotics (Autonomous Systems); Computer Sciences
Identifiers
urn:nbn:se:umu:diva-109206 (URN), 10.1016/j.promfg.2015.07.252 (DOI)
Conference
6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015)
Available from: 2015-09-22. Created: 2015-09-22. Last updated: 2018-01-11. Bibliographically approved.
Abedan Kondori, F., Yousefi, S., Kouma, J.-P., Liu, L. & Li, H. (2015). Direct hand pose estimation for immersive gestural interaction. Pattern Recognition Letters, 66, 91-99.
Direct hand pose estimation for immersive gestural interaction
2015 (English) In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, 91-99 p. Article in journal (Refereed). Published
Abstract [en]

This paper presents a novel approach for performing intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition. The problem can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper, we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to estimate 3D hand motion directly, without imposing additional constraints. Extensive experiments show that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system's performance in 3D object manipulation on two different setups: a desktop computer and a mobile platform. This demonstrates the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience and interaction quality of 3D gestural interaction in comparison to 2D touchscreen interaction.
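
The paper derives its own constraint from range images; as a reference point, a standard range-flow constraint of the same flavour (from the range-flow literature, not necessarily the paper's exact formulation) reads:

```latex
% A standard range-flow constraint (the paper's exact derivation may differ).
% For a depth map Z(x, y, t) and a surface point with 3D velocity (u, v, w):
\[
  Z_x\,u + Z_y\,v + Z_t = w
  \quad\Longleftrightarrow\quad
  Z_x\,u + Z_y\,v - w + Z_t = 0,
\]
% the depth analogue of the brightness constraint I_x u + I_y v + I_t = 0;
% stacking it over many pixels gives a linear system in the motion parameters.
```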

Keyword
Immersive gestural interaction, Dynamic gesture recognition, Hand pose estimation
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-86748 (URN), 10.1016/j.patrec.2015.03.013 (DOI), 000362271100011 (ISI)
Available from: 2014-03-06. Created: 2014-03-06. Last updated: 2017-12-05. Bibliographically approved.
Halawani, A. & Li, H. (2015). Human Ear Localization: A Template-based Approach. Paper presented at ICOPR 2015, International Workshop on Pattern Recognition, Dubai, May 4-5, 2015.
Human Ear Localization: A Template-based Approach
2015 (English) Conference paper, Published paper (Refereed)
Abstract [en]

We propose a simple yet effective technique for shape-based ear localization. The idea is based on a predefined binary ear template that is matched to ear contours in a given edge image. To cope with changes in ear shapes and sizes, the template is allowed to deform; deformation is achieved by dividing the template into segments. A dynamic programming search algorithm accomplishes the matching process, achieving very robust localization results in various cluttered and noisy setups.
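
A simplified sketch of segment-wise deformable template matching with dynamic programming (1D offsets only; the edge_cost callback and the smoothness penalty are hypothetical stand-ins for the paper's formulation):

```python
# Illustrative sketch, simplified far beyond the paper. The template is split
# into segments; each segment may shift slightly, and dynamic programming
# picks per-segment offsets that minimize total matching cost while
# penalizing disagreement between neighbouring segments.
import numpy as np

def match_deformable_template(edge_cost, segments, offsets, smooth=1.0):
    """edge_cost(seg, off) -> cost of placing `seg` at offset `off`.
    Returns the minimum-cost offset per segment (Viterbi-style DP)."""
    n, m = len(segments), len(offsets)
    D = np.zeros((n, m))            # D[i, j]: best cost, segment i at offsets[j]
    back = np.zeros((n, m), int)    # backpointers for recovering the placement
    D[0] = [edge_cost(segments[0], o) for o in offsets]
    for i in range(1, n):
        for j, o in enumerate(offsets):
            trans = [D[i - 1, k] + smooth * abs(offsets[k] - o)
                     for k in range(m)]
            back[i, j] = int(np.argmin(trans))
            D[i, j] = edge_cost(segments[i], o) + min(trans)
    j = int(np.argmin(D[-1]))       # backtrack from the best final state
    best = [j]
    for i in range(n - 1, 0, -1):
        j = back[i, j]
        best.append(j)
    return [offsets[j] for j in reversed(best)]
```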

National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:umu:diva-109204 (URN)
Conference
ICOPR 2015, International Workshop on Pattern Recognition, Dubai, May 4-5, 2015
Available from: 2015-09-22. Created: 2015-09-22. Last updated: 2018-01-11.
Lv, Z., Halawani, A., Feng, S., ur Réhman, S. & Li, H. (2015). Touch-less interactive augmented reality game on vision-based wearable device. Personal and Ubiquitous Computing, 19(3-4), 551-567.
Touch-less interactive augmented reality game on vision-based wearable device
2015 (English) In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917, Vol. 19, no 3-4, 551-567 p. Article in journal (Refereed). Published
Abstract [en]

There is increasing interest in creating pervasive games based on emerging interaction technologies. To develop touch-less, interactive, augmented reality games on a vision-based wearable device, a touch-less motion interaction technology is designed and evaluated in this work. Users interact with the augmented reality games through dynamic hand/foot gestures in front of the camera, which trigger interaction events that manipulate the virtual objects in the scene. As a proof of concept, three primitive augmented reality games with eleven dynamic gestures are developed on top of the proposed touch-less interaction technology. Finally, a comparative evaluation demonstrates the social acceptability and usability of the touch-less approach, running on a hybrid wearable framework or with Google Glass, together with workload assessment and measures of users' emotions and satisfaction.
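
As a loose illustration of camera-facing gesture events, the sketch below fires an interaction event when enough motion appears in a region of interest; frame differencing is an assumed stand-in, far simpler than the paper's dynamic gesture recognition:

```python
# Illustrative sketch, not the paper's method: a touch-less trigger that
# fires when a fraction of pixels in a region of interest of the wearable
# camera's view changes between consecutive frames.
import cv2

def motion_trigger(prev_gray, curr_gray, roi, thresh=25, min_frac=0.05):
    x, y, w, h = roi
    diff = cv2.absdiff(prev_gray[y:y+h, x:x+w], curr_gray[y:y+h, x:x+w])
    moving = (diff > thresh).mean()   # fraction of ROI pixels that changed
    return moving > min_frac          # True -> fire the interaction event
```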

Keyword
Wearable device, Smartphone game, Hands-free, Pervasive game, Augmented reality game, Touch-less
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:umu:diva-109203 (URN), 10.1007/s00779-015-0844-1 (DOI), 000357471100006 (ISI)
Available from: 2015-09-22. Created: 2015-09-22. Last updated: 2018-01-11. Bibliographically approved.
Yousefi, S., Li, H. & Liu, L. (2014). 3D Gesture Analysis Using a Large-Scale Gesture Database. In: Bebis, G; Boyle, R; Parvin, B; Koracin, D; McMahan, R; Jerald, J; Zhang, H; Drucker, SM; Kambhamettu, C; ElChoubassi, M; Deng, Z; Carlson, M (Ed.), Advances in Visual Computing: 10th International Symposium, ISVC 2014, Las Vegas, NV, USA, December 8-10, 2014, Proceedings, Part I. Paper presented at 10th International Symposium on Visual Computing (ISVC), DEC 08-10, 2014, Las Vegas, NV (pp. 206-217).
3D Gesture Analysis Using a Large-Scale Gesture Database
2014 (English) In: Advances in Visual Computing: 10th International Symposium, ISVC 2014, Las Vegas, NV, USA, December 8-10, 2014, Proceedings, Part I / [ed] Bebis, G; Boyle, R; Parvin, B; Koracin, D; McMahan, R; Jerald, J; Zhang, H; Drucker, SM; Kambhamettu, C; ElChoubassi, M; Deng, Z; Carlson, M, 2014, 206-217 p. Conference paper, Published paper (Refereed)
Abstract [en]

3D gesture analysis is a highly desired feature of future interaction design. Specifically, in augmented environments, intuitive interaction with the physical space seems unavoidable, and 3D gestural interaction might be the most effective alternative to current input facilities. This paper introduces a novel solution for real-time 3D gesture analysis using an extremely large gesture database. The database contains images of various articulated hand gestures together with annotated 3D position/orientation parameters of the hand joints. Our search algorithm is based on hierarchical scoring of low-level edge-orientation features between the query input and the database entries, retrieving the best match. Once the best match is found in real time, its pre-calculated 3D parameters can be used instantly for gesture-based interaction.
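
A much-simplified sketch of matching low-level edge-orientation features between a query and database entries (the grid size, bin count and plain nearest-neighbour scoring are assumptions; the paper uses a hierarchical scoring scheme):

```python
# Illustrative sketch, not the paper's search algorithm. Each image is
# reduced to gradient-orientation histograms on a coarse grid; the best
# database match is the entry with minimal feature distance, whose annotated
# 3D joint parameters would then be reused directly.
import numpy as np

def edge_orientation_feature(gray, grid=4, bins=8):
    gy, gx = np.gradient(gray.astype(float))
    mag, ang = np.hypot(gx, gy), np.arctan2(gy, gx) % np.pi
    h, w = gray.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = (slice(i * h // grid, (i + 1) * h // grid),
                    slice(j * w // grid, (j + 1) * w // grid))
            hist, _ = np.histogram(ang[cell], bins=bins, range=(0, np.pi),
                                   weights=mag[cell])
            feats.append(hist / (hist.sum() + 1e-9))
    return np.concatenate(feats)

def best_match(query_feat, db_feats):
    """db_feats: (N, D) array of precomputed features; returns best row."""
    return int(np.argmin(np.linalg.norm(db_feats - query_feat, axis=1)))
```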

Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 8887
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:umu:diva-106166 (URN), 10.1007/978-3-319-14249-4_20 (DOI), 000354694000020 (ISI), 978-3-319-14249-4 (ISBN), 978-3-319-14248-7 (ISBN)
Conference
10th International Symposium on Visual Computing (ISVC), DEC 08-10, 2014, Las Vegas, NV
Available from: 2015-07-09. Created: 2015-07-09. Last updated: 2018-01-11. Bibliographically approved.
Abedan Kondori, F., Yousefi, S., Ostovar, A., Liu, L. & Li, H. (2014). A Direct Method for 3D Hand Pose Recovery. In: 22nd International Conference on Pattern Recognition. Paper presented at 22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden (pp. 345-350).
A Direct Method for 3D Hand Pose Recovery
2014 (English) In: 22nd International Conference on Pattern Recognition, 2014, 345-350 p. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents a novel approach for performing intuitive 3D gesture-based interaction using depth data acquired by Kinect. Unlike current depth-based systems that focus only on the classical gesture recognition problem, we also consider 3D gesture pose estimation in order to create immersive gestural interaction. We formulate a gesture-based interaction system as a combination of two separate problems: gesture recognition and gesture pose estimation. We focus on the second problem and propose a direct method for recovering hand motion parameters. Based on the range images, a new version of the optical flow constraint equation is derived, which can be used to estimate 3D hand motion directly, without imposing additional constraints. Our experiments show that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system's performance in 3D object manipulation. This application is intended to explore the system's capabilities in real-time biomedical applications. Finally, a system usability test is conducted to evaluate the learnability, user experience and interaction quality of 3D interaction in comparison to 2D touch-screen interaction.
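
To make the "direct estimation" step concrete: stacking a range-flow-style constraint over hand pixels yields a linear least-squares problem. The sketch below solves the rigid-translation special case (an assumption; the paper recovers a fuller set of hand motion parameters):

```python
# Illustrative sketch, not the paper's full method. Stack the range-flow
# constraint Zx*u + Zy*v - w + Zt = 0 over all hand pixels and solve for a
# single 3D translation (u, v, w) by least squares.
import numpy as np

def estimate_translation(Zx, Zy, Zt, mask):
    """Zx, Zy: spatial depth gradients; Zt: temporal depth difference;
    mask: boolean array selecting hand pixels."""
    A = np.stack([Zx[mask], Zy[mask], -np.ones(mask.sum())], axis=1)
    b = -Zt[mask]
    (u, v, w), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v, w
```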

Series
International Conference on Pattern Recognition, ISSN 1051-4651
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:umu:diva-108475 (URN), 10.1109/ICPR.2014.68 (DOI), 000359818000057 (ISI), 978-1-4799-5208-3 (ISBN)
Conference
22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden
Available from: 2015-09-14. Created: 2015-09-11. Last updated: 2016-05-27. Bibliographically approved.
Khan, M. S., ur Réhman, S., La Hera, P., Liu, F. & Li, H. (2014). A pilot user's prospective in mobile robotic telepresence system. In: 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014). Paper presented at Annual Summit and Conference of Asia-Pacific-Signal-and-Information-Processing-Association (APSIPA), DEC 09-12, 2014, Angkor, Cambodia. IEEE.
A pilot user's prospective in mobile robotic telepresence system
2014 (English) In: 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper, Published paper (Refereed)
Abstract [en]

In this work we present an interactive video conferencing system specifically designed to enhance the experience of video teleconferencing for a pilot user. We use an Embodied Telepresence System (ETS) that was previously designed to enhance the video teleconferencing experience for the collaborators; here we deploy the ETS in a novel scenario to improve the pilot user's experience during distance communication. The ETS adjusts the pilot user's view at the distant location (e.g. a remotely located conference or meeting). A velocity-profile control for the ETS is developed that is implicitly driven by the head movements of the pilot user. An experiment was conducted to test whether the view-adjustment capability of the ETS increases the pilot user's experience of collaboration in video conferencing. In the user study, participants (pilot users) interacted through the ETS and through a traditional computer-based video conferencing tool. Overall, the study suggests that our approach is effective and enhances the video conferencing experience for the pilot user.
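
A minimal sketch of a head-driven velocity-profile command (the proportional gain, speed and acceleration limits are hypothetical; this is not the paper's controller). Rate-limiting the velocity change produces the trapezoid-like speed profiles typical of such controllers:

```python
# Illustrative sketch, not the paper's velocity-profile control. The ETS neck
# velocity tracks the pilot user's head yaw: proportional to the remaining
# angle, clamped in speed, and rate-limited in acceleration.

def velocity_command(target_deg, current_deg, v_prev, dt,
                     k=2.0, v_max=30.0, a_max=60.0):
    """Return the next angular velocity in deg/s (hypothetical gains/limits)."""
    v_des = k * (target_deg - current_deg)                   # proportional term
    v_des = max(-v_max, min(v_max, v_des))                   # speed limit
    dv = max(-a_max * dt, min(a_max * dt, v_des - v_prev))   # acceleration limit
    return v_prev + dv
```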

Place, publisher, year, edition, pages
IEEE, 2014
Keyword
Teleconferencing, Collaboration, Computers, Estimation, Noise, Computer science, Human factors
National Category
Robotics; Interaction Technologies; Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:umu:diva-108559 (URN), 10.1109/APSIPA.2014.7041635 (DOI), 000392861900123 (ISI), 978-6-1636-1823-8 (ISBN)
Conference
Annual Summit and Conference of Asia-Pacific-Signal-and-Information-Processing-Association (APSIPA), DEC 09-12, 2014, Angkor, CAMBODIA
Available from: 2015-09-14. Created: 2015-09-14. Last updated: 2018-01-11. Bibliographically approved.
Wu, J., Bisio, I., Gniady, C., Hossain, E., Valla, M. & Li, H. (2014). CONTEXT-AWARE NETWORKING AND COMMUNICATIONS: PART 1. IEEE Communications Magazine, 52(6), 14-15.
CONTEXT-AWARE NETWORKING AND COMMUNICATIONS: PART 1
2014 (English) In: IEEE Communications Magazine, ISSN 0163-6804, E-ISSN 1558-1896, Vol. 52, no 6, 14-15 p. Article in journal, Editorial material (Other academic). Published
National Category
Other Computer and Information Science
Identifiers
urn:nbn:se:umu:diva-91284 (URN), 10.1109/MCOM.2014.6829939 (DOI), 000338032800002 (ISI)
Available from: 2014-07-28. Created: 2014-07-28. Last updated: 2018-01-11. Bibliographically approved.