Liu, Li
Publications (10 of 27)
Chen, M., Xia, D., Min, C., Zhao, X., Chen, Y., Liu, L. & Li, X. (2016). Neonatal repetitive pain in rats leads to impaired spatial learning and dysregulated hypothalamic-pituitary-adrenal axis function in later life. Scientific Reports, 6, Article ID 39159.
2016 (English). In: Scientific Reports, ISSN 2045-2322, E-ISSN 2045-2322, Vol. 6, article id 39159. Article in journal (Peer reviewed). Published
Abstract [en]

Preterm birth is a major health issue. As part of their life-saving care, most preterm infants require hospitalization and are inevitably exposed to repetitive skin-breaking procedures. The long-term effects of neonatal repetitive pain on cognitive and emotional behaviors involving hypothalamic-pituitary-adrenal (HPA) axis function in young and adult rats are unknown. From P8 to P85, mechanical hypersensitivity of the bilateral hindpaws was observed in the Needle group (P < 0.001). Compared with the Tactile group, the Needle group took longer to find the platform on P30 than on P29 (P = 0.03), with a decreased number of original platform site crossings during the probe trial of the Morris water maze test (P = 0.026). Moreover, the Needle group spent more time and traveled longer distances in the central area than the Tactile group in the Open-field test, both in prepubertal and adult rats (P < 0.05). The HPA axis function in the Needle group differed from that in the Tactile group (P < 0.05), with decreased stress responsiveness in prepuberty and puberty (P < 0.05) and increased stress responsiveness in adulthood (P < 0.05). This study indicates that repetitive pain occurring during a critical period may cause severe consequences, with behavioral and neuroendocrine disturbances developing through prepuberty into adult life.

Place, publisher, year, edition, pages
Nature Publishing Group, 2016
Identifiers
urn:nbn:se:umu:diva-129707 (URN) 10.1038/srep39159 (DOI) 000389814700001 ()
Available from: 2017-01-10 Created: 2017-01-09 Last updated: 2018-06-09 Bibliographically approved
Abedan Kondori, F., Yousefi, S., Kouma, J.-P., Liu, L. & Li, H. (2015). Direct hand pose estimation for immersive gestural interaction. Pattern Recognition Letters, 66, 91-99
2015 (English). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, pp. 91-99. Article in journal (Peer reviewed). Published
Abstract [en]

This paper presents a novel approach for performing intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition. This problem can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to directly estimate 3D hand motion without the need to impose other constraints. Extensive experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation on two different setups: a desktop computer and a mobile platform. This reveals the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience and interaction quality in 3D gestural interaction in comparison to 2D touchscreen interaction.
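The central technical idea in the abstract above — deriving a motion constraint directly from range (depth) images, by analogy with the classical brightness-constancy optical flow equation — can be sketched as follows. This is the standard form of the two constraints, not necessarily the exact formulation used in the paper:

```latex
% Classical optical flow constraint (brightness constancy):
% I(x, y, t) is image intensity; (u, v) is the image-plane motion.
I_x\, u + I_y\, v + I_t = 0

% Range-flow analogue for depth images (depth constancy):
% Z(x, y, t) is the depth map; (u, v, w) are the 3D motion components.
% The sign of w depends on the chosen camera-axis convention.
Z_x\, u + Z_y\, v - w + Z_t = 0
```

Stacking such a constraint over many pixels of the hand region yields an overdetermined linear system in the motion parameters, which is why such methods are called "direct": no feature matching or additional imposed constraints are required.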

Keywords
Immersive gestural interaction, Dynamic gesture recognition, Hand pose estimation
Identifiers
urn:nbn:se:umu:diva-86748 (URN) 10.1016/j.patrec.2015.03.013 (DOI) 000362271100011 ()
Available from: 2014-03-06 Created: 2014-03-06 Last updated: 2018-06-08 Bibliographically approved
Yousefi, S., Li, H. & Liu, L. (2014). 3D Gesture Analysis Using a Large-Scale Gesture Database. In: Bebis, G; Boyle, R; Parvin, B; Koracin, D; McMahan, R; Jerald, J; Zhang, H; Drucker, SM; Kambhamettu, C; ElChoubassi, M; Deng, Z; Carlson, M (Ed.), Advances in Visual Computing: 10th International Symposium, ISVC 2014, Las Vegas, NV, USA, December 8-10, 2014, Proceedings, Part I. Paper presented at the 10th International Symposium on Visual Computing (ISVC), December 8-10, 2014, Las Vegas, NV (pp. 206-217).
2014 (English). In: Advances in Visual Computing: 10th International Symposium, ISVC 2014, Las Vegas, NV, USA, December 8-10, 2014, Proceedings, Part I / [ed] Bebis, G; Boyle, R; Parvin, B; Koracin, D; McMahan, R; Jerald, J; Zhang, H; Drucker, SM; Kambhamettu, C; ElChoubassi, M; Deng, Z; Carlson, M, 2014, pp. 206-217. Conference paper, published paper (Peer reviewed)
Abstract [en]

3D gesture analysis is a highly desired feature of future interaction design. Specifically, in augmented environments, intuitive interaction with the physical space seems unavoidable, and 3D gestural interaction might be the most effective alternative to current input facilities. This paper introduces a novel solution for real-time 3D gesture analysis using an extremely large gesture database. This database includes images of various articulated hand gestures with annotated 3D position/orientation parameters of the hand joints. Our search algorithm is based on the hierarchical scoring of low-level edge-orientation features between the query input and the database, retrieving the best match. Once the best match is found from the database in real time, the pre-calculated 3D parameters can instantly be used for gesture-based interaction.
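As an illustration of the retrieval idea described above — scoring low-level edge-orientation features between a query image and a database and returning the best match — here is a minimal sketch in Python. The coarse orientation histogram and the histogram-intersection score are stand-ins chosen for brevity; the paper's actual descriptor, hierarchical scoring scheme, and database are not reproduced here:

```python
import numpy as np

def edge_orientation_histogram(image, bins=8):
    """Quantize gradient orientations into a coarse, magnitude-weighted
    histogram. A simple stand-in for low-level edge-orientation features."""
    gy, gx = np.gradient(image.astype(float))     # row- and column-direction gradients
    mag = np.hypot(gx, gy)                        # edge strength per pixel
    ang = np.mod(np.arctan2(gy, gx), np.pi)       # orientation folded into [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist

def retrieve_best_match(query, database):
    """Return the index of the database image whose edge-orientation
    histogram best matches the query (histogram-intersection score)."""
    q = edge_orientation_histogram(query)
    scores = [np.minimum(q, edge_orientation_histogram(d)).sum() for d in database]
    return int(np.argmax(scores))
```

In a full system each database image would carry pre-annotated 3D joint parameters, so the index returned by `retrieve_best_match` would immediately yield a pose estimate.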

Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 8887
Identifiers
urn:nbn:se:umu:diva-106166 (URN) 10.1007/978-3-319-14249-4_20 (DOI) 000354694000020 () 978-3-319-14249-4 (ISBN) 978-3-319-14248-7 (ISBN)
Conference
10th International Symposium on Visual Computing (ISVC), December 8-10, 2014, Las Vegas, NV
Available from: 2015-07-09 Created: 2015-07-09 Last updated: 2018-06-07 Bibliographically approved
Abedan Kondori, F., Yousefi, S., Ostovar, A., Liu, L. & Li, H. (2014). A Direct Method for 3D Hand Pose Recovery. In: 22nd International Conference on Pattern Recognition. Paper presented at the 22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden (pp. 345-350).
2014 (English). In: 22nd International Conference on Pattern Recognition, 2014, pp. 345-350. Conference paper, published paper (Peer reviewed)
Abstract [en]

This paper presents a novel approach for performing intuitive 3D gesture-based interaction using depth data acquired by Kinect. Unlike current depth-based systems that focus only on the classical gesture recognition problem, we also consider 3D gesture pose estimation for creating immersive gestural interaction. In this paper, we formulate the gesture-based interaction system as a combination of two separate problems: gesture recognition and gesture pose estimation. We focus on the second problem and propose a direct method for recovering hand motion parameters. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to directly estimate 3D hand motion without the need to impose other constraints. Our experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation. This application is intended to explore the system's capabilities in real-time biomedical applications. Finally, a system usability test is conducted to evaluate the learnability, user experience and interaction quality of 3D interaction in comparison to 2D touch-screen interaction.

Series
International Conference on Pattern Recognition, ISSN 1051-4651
Identifiers
urn:nbn:se:umu:diva-108475 (URN) 10.1109/ICPR.2014.68 (DOI) 000359818000057 () 978-1-4799-5208-3 (ISBN)
Conference
22nd International Conference on Pattern Recognition (ICPR), 24–28 August 2014, Stockholm, Sweden
Available from: 2015-09-14 Created: 2015-09-11 Last updated: 2019-11-11 Bibliographically approved
Abedan Kondori, F., Yousefi, S., Liu, L. & Li, H. (2014). Head operated electric wheelchair. In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014). Paper presented at the IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI) (pp. 53-56). IEEE
2014 (English). In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014), IEEE, 2014, pp. 53-56. Conference paper, published paper (Peer reviewed)
Abstract [en]

Currently, the most common way to control an electric wheelchair is with a joystick. However, some individuals are unable to operate joystick-driven electric wheelchairs due to severe physical disabilities, such as patients with quadriplegia. This paper proposes a novel head pose estimation method to assist such patients. Head motion parameters are employed to control and drive an electric wheelchair. We introduce a direct method for estimating user head motion, based on a sequence of range images captured by Kinect. In this work, we derive a new version of the optical flow constraint equation for range images. We show how the new equation can be used to estimate head motion directly. Experimental results reveal that the proposed system works with high accuracy in real time. We also show simulation results for navigating the electric wheelchair by recovering user head motion.
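The control loop implied by the abstract above — turning estimated head motion into wheelchair drive commands — can be sketched as follows. This is a hypothetical mapping written for illustration only; the function, its angle thresholds, and its velocity limits are assumptions, not the paper's implementation:

```python
def head_to_wheelchair(pitch_deg, yaw_deg, dead_zone=5.0,
                       max_linear=1.0, max_angular=0.5, max_tilt=30.0):
    """Map head tilt angles to (linear_velocity, angular_velocity) commands.

    pitch_deg > 0 (nodding forward) drives forward; yaw_deg steers.
    A dead zone keeps small involuntary head movements from moving the chair.
    All thresholds and limits are assumed values for illustration.
    """
    def scale(angle, vmax):
        if abs(angle) < dead_zone:
            return 0.0                      # inside the dead zone: no motion
        sign = 1.0 if angle > 0 else -1.0
        # Linear ramp from the dead-zone edge up to the maximum tilt angle.
        frac = min((abs(angle) - dead_zone) / (max_tilt - dead_zone), 1.0)
        return sign * frac * vmax

    return scale(pitch_deg, max_linear), scale(yaw_deg, max_angular)
```

A dead zone of this kind is a common safety choice in assistive interfaces, so that sensor noise and small involuntary movements do not translate into chair motion.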

Place, publisher, year, edition, pages
IEEE, 2014
Series
IEEE Southwest Symposium on Image Analysis and Interpretation, ISSN 1550-5782
Identifiers
urn:nbn:se:umu:diva-86746 (URN) 000355255900014 () 978-1-4799-4053-0 (ISBN)
Conference
IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI)
Available from: 2014-03-06 Created: 2014-03-06 Last updated: 2018-06-08 Bibliographically approved
Kondori, F. A., Liu, L. & Li, H. (2014). Telelife: An immersive media experience for rehabilitation. In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014). Paper presented at the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9-12 December 2014, Chiang Mai, Thailand. IEEE
2014 (English). In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper, published paper (Peer reviewed)
Abstract [en]

In recent years, the emergence of telerehabilitation systems for home-based therapy has altered healthcare systems. Telerehabilitation enables therapists to observe patients' status via the Internet, so a patient does not have to visit a rehabilitation facility for every session. Although telerehabilitation provides great opportunities, two major issues affect its effectiveness: the isolation of the patient at home, and the loss of the therapist's direct supervision. Since patients have no actual interaction with other people during the rehabilitation period, they become isolated and gradually lose their social skills. Moreover, without the direct supervision of therapists, rehabilitation exercises can be performed with poor compensation strategies that lead to a poor-quality recovery. To resolve these issues, we propose telelife, a new concept for future rehabilitation systems. The idea is to use media technology to create a totally new immersive media experience for rehabilitation. In telerehabilitation, patients execute exercises locally and therapists monitor the patients' status remotely. In telelife, by contrast, patients perform exercises remotely and therapists monitor locally. Thus, telelife not only enables rehabilitation at a distance, but also improves patients' social competences and provides the direct supervision of therapists. In this paper we introduce telelife to enhance telerehabilitation, and investigate technical challenges and possible methods to achieve it.

Place, publisher, year, edition, pages
IEEE, 2014
Identifiers
urn:nbn:se:umu:diva-88093 (URN) 10.1109/APSIPA.2014.7041675 (DOI) 000392861900163 () 978-6-1636-1823-8 (ISBN)
Conference
2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), 9-12 December 2014, Chiang Mai, Thailand
Available from: 2014-04-23 Created: 2014-04-23 Last updated: 2018-06-08 Bibliographically approved
ur Réhman, S., Khan, M. S., Liu, L. & Li, H. (2014). Vibrotactile TV for immersive experience. In: Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific. Paper presented at the Annual Summit and Conference of the Asia-Pacific Signal and Information Processing Association (APSIPA), December 9-12, 2014, Angkor, Cambodia.
2014 (English). In: Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, 2014. Conference paper, published paper (Peer reviewed)
Abstract [en]

Audio and video are two powerful media forms for shortening the distance between the audience and the actors or players in TV and film. Recent research shows that people today consume more and more multimedia content on mobile devices such as tablets and smartphones. Therefore, an important question emerges: how can we render high-quality, personal immersive experiences to consumers on these systems? To give the audience an immersive engagement that differs from 'watching a play', we designed a study to render complete immersive media, including 'emotional information', based on augmented vibrotactile coding on the back of the user along with the audio-video signal. The reported emotional responses to videos viewed with and without haptic enhancement show that participants exhibited an increased emotional response to media with haptic enhancement. Overall, these studies suggest the effectiveness of our approach and that a multisensory approach increases immersion and user satisfaction.

Keywords
tactile interface, multimodal interaction, interactive TV, Audio-Video processing
Research subject
computerized image analysis; human-computer interaction; signal processing
Identifiers
urn:nbn:se:umu:diva-101711 (URN) 10.1109/APSIPA.2014.7041631 (DOI) 000392861900119 () 978-6-1636-1823-8 (ISBN)
Conference
Annual Summit and Conference of the Asia-Pacific Signal and Information Processing Association (APSIPA), December 9-12, 2014, Angkor, Cambodia
Available from: 2015-04-09 Created: 2015-04-09 Last updated: 2018-06-07 Bibliographically approved
Abedan Kondori, F., Yousefi, S. & Liu, L. (2013). Active human gesture capture for diagnosing and treating movement disorders. Paper presented at the Swedish Symposium on Image Analysis (SSBA2013), Gothenburg, Sweden.
2013 (English). Conference paper, published paper (Other academic)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to solve the above-mentioned issues. Mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient in daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.

Identifiers
urn:nbn:se:umu:diva-83352 (URN)
Conference
The Swedish Symposium on Image Analysis (SSBA2013), Gothenburg, Sweden
Available from: 2013-11-22 Created: 2013-11-22 Last updated: 2018-06-08 Bibliographically approved
Abedan Kondori, F. & Liu, L. (2012). 3D Active Human Motion Estimation for Biomedical Applications. In: Mian Long (Ed.), World Congress on Medical Physics and Biomedical Engineering May 26-31, 2012, Beijing, China. Paper presented at the World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012 (pp. 1014-1017). Springer Berlin/Heidelberg
2012 (English). In: World Congress on Medical Physics and Biomedical Engineering May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, pp. 1014-1017. Conference paper, published paper (Peer reviewed)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to solve the above-mentioned issues. Mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient while practicing daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. 4 pages
Series
IFMBE Proceedings, ISSN 1680-0737 ; 39
Keywords
Active motion tracking, Human motion analysis, Movement disorder, SIFT
Identifiers
urn:nbn:se:umu:diva-55831 (URN) 10.1007/978-3-642-29305-4_266 (DOI) 978-3-642-29304-7 (print) (ISBN) 978-3-642-29305-4 (ISBN)
Conference
World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012
Available from: 2012-06-04 Created: 2012-06-04 Last updated: 2018-06-08 Bibliographically approved
Chen, M., Shi, X., Chen, Y., Cao, Z., Cheng, R., Xu, Y., . . . Li, X. (2012). A prospective study of pain experience in a neonatal intensive care unit of China. The Clinical Journal of Pain, 28(8), 700-704
2012 (English). In: The Clinical Journal of Pain, ISSN 0749-8047, E-ISSN 1536-5409, Vol. 28, no. 8, pp. 700-704. Article in journal (Peer reviewed). Published
Abstract [en]

Objectives: To assess pain burden in neonates during their hospitalization in China and thus provide evidence for the necessity of neonatal pain management. Patients and Methods: The Neonatal Facial Coding System was used to evaluate pain in neonates. We prospectively collected data of all painful procedures performed on 108 neonates (term, 62; preterm, 46) recruited from admission to discharge in a neonatal intensive care unit of a university-affiliated hospital in China. Results: We found that during hospitalization each preterm and term neonate was exposed to a median of 100.0 (range, 11 to 544) and 56.5 (range, 12 to 249) painful procedures, respectively. Most of the painful procedures were performed within the first 3 days. Preterm neonates, especially those born at 28 and 29 weeks' gestational age, experienced more pain than those born at 30 weeks' gestation or later (P < 0.001). Among those painful procedures, tracheal aspiration was the most frequently performed on preterm neonates, and intravenous cannulation was the most common for term neonates. Moreover, tracheal intubations and femoral venous puncture were found to be the most painful. Notably, none of the painful procedures was accompanied by analgesia. Conclusions: Neonates, particularly preterm neonates, were exposed to numerous invasive painful procedures without appropriate analgesia in hospitals in China. The potential long-term impacts of poorly treated pain in neonates call for a change in pediatric practice in China and in countries with similar practices.

Place, publisher, year, edition, pages
Philadelphia, PA, USA: Lippincott Williams & Wilkins, 2012
Keywords
neonates, procedural pain, neonatal intensive care unit, analgesia
Identifiers
urn:nbn:se:umu:diva-60504 (URN) 10.1097/AJP.0b013e3182400d54 (DOI) 000308672100008 ()
Available from: 2012-10-16 Created: 2012-10-15 Last updated: 2018-06-08 Bibliographically approved