Posts Tagged Leap Motion Controller

[Abstract] Autonomous Use of the Home Virtual Rehabilitation System: A Feasibility and Pilot Study

Objective: This article describes the findings of a study examining the ability of persons with strokes to use the home virtual rehabilitation system (HoVRS), a home-based rehabilitation system, and the impact of motivational enhancement techniques on subjects’ motivation, adherence, and motor function improvements following a 3-month training program.

Materials and Methods: HoVRS integrates a Leap Motion controller, a passive arm support, and a suite of custom-designed hand rehabilitation simulations. For this study, we developed a library of three simulations, which include activities such as flexing and extending the fingers to move a car, flying a plane with wrist movement, and controlling an avatar running in a maze using reaching movements. Two groups of subjects, the enhanced motivation (EM) group and the unenhanced control (UC) group, used the system for 12 weeks in their homes. The EM group trained using three simulations that provided 8–12 levels of difficulty and complexity; graphics and scoring opportunities increased at each new level. The UC group performed the same simulations, but difficulty was increased by an algorithm that made small incremental adjustments designed to be imperceptible.
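The imperceptible incremental progression used for the UC group can be illustrated with a toy difficulty controller that nudges the level in small steps toward a target success rate. The step size and target here are hypothetical, for illustration only, not the study's actual parameters:

```python
def next_difficulty(current, success_rate, step=0.02, target=0.8):
    """Adjust difficulty (0..1) by a small, imperceptible step.

    If the player succeeds more often than the target rate, make the
    task slightly harder; otherwise, slightly easier. `step` and
    `target` are hypothetical values, not those used in the study.
    """
    if success_rate > target:
        return min(current + step, 1.0)  # nudge up, stay in range
    return max(current - step, 0.0)      # nudge down, stay in range
```

Because each adjustment is a small fraction of the difficulty range, consecutive trials feel the same to the player while the task drifts toward a steady challenge level.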

Results: Adherence to both the EM and UC protocols exceeded adherence to home exercise programs described in the stroke rehabilitation literature. Both groups demonstrated improvements in upper extremity function. Intrinsic motivation was higher in the EM group and was maintained over the 12-week protocol.

Conclusion: A 12-week home-based training program using HoVRS was feasible. Motivational enhancement may have a positive impact on motivation, adherence, and motor outcome.


via Autonomous Use of the Home Virtual Rehabilitation System: A Feasibility and Pilot Study | Games for Health Journal


[Abstract] Hand Rehabilitation via Gesture Recognition Using Leap Motion Controller – Conference Paper

I. Introduction

Stroke is currently the fourth leading cause of death in the United States. In fact, every 40 seconds, someone in the US has a stroke, and around 50% of stroke survivors suffer damage to the upper extremity [1]–[3]. Many approaches to treating and recovering from stroke have been developed over the years, and recent studies show that combining the recovery process with an existing rehabilitation plan provides better results and an increase in patients’ quality of life [4]–[6]. Part of the stroke recovery process is a rehabilitation plan [7]. The process can be difficult, intensive and long, depending on how severe the stroke was and which parts of the brain were damaged. It usually involves working with a team of health care providers on a full, extensive rehabilitation plan, which includes hospital care and home exercises.
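Several of the works cited below ([8]–[10], [21], [22]) apply machine learning, including support vector machines, to gesture data. A minimal sketch of such a classification pipeline, using scikit-learn on synthetic stand-in feature vectors (in a real system the features would be derived from Leap Motion hand data, not simulated), might look like:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for Leap-derived feature vectors (e.g., fingertip
# positions flattened per frame); these mock features only illustrate
# the pipeline shape, not real sensor output.
n_per_class, n_features = 50, 15
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(n_per_class, n_features))
               for c in range(3)])        # three mock gesture classes
y = np.repeat(np.arange(3), n_per_class)  # class labels 0, 1, 2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)   # RBF-kernel SVM classifier
accuracy = clf.score(X_te, y_te)          # fraction correct on held-out data
```

The essential structure — feature extraction, train/test split, supervised classifier — is common to the Kinect- and Leap-based recognition systems the references describe, whatever the specific model used.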

References

1. D. Tsoupikova, N. S. Stoykov, M. Corrigan, K. Thielbar, R. Vick, Y. Li, K. Triandafilou, F. Preuss, D. Kamper, “Virtual immersion for poststroke hand rehabilitation therapy”, Annals of biomedical engineering, vol. 43, no. 2, pp. 467-477, 2015.

2. J. E. Pompeu, T. H. Alonso, I. B. Masson, S. M. A. A. Pompeu, C. Torriani-Pasin, “The effects of virtual reality on stroke rehabilitation: a systematic review”, Motricidade, vol. 10, no. 4, pp. 111-122, 2014.

3. J.-H. Shin, S. B. Park, S. H. Jang, “Effects of game-based virtual reality on health-related quality of life in chronic stroke patients: A randomized controlled study”, Computers in biology and medicine, vol. 63, pp. 92-98, 2015.

4. R. W. Teasell, L. Kalra, “What’s new in stroke rehabilitation”, Stroke, vol. 35, no. 2, pp. 383-385, 2004.

5. E. McDade, S. Kittner, “Ischemic stroke in young adults” in Stroke Essentials for Primary Care, Springer, pp. 123-146, 2009.

6. P. Langhorne, J. Bernhardt, G. Kwakkel, “Stroke rehabilitation”, The Lancet, vol. 377, no. 9778, pp. 1693-1702, 2011.

7. C. J. Winstein, J. Stein, R. Arena, B. Bates, L. R. Cherney, S. C. Cramer, F. Deruyter, J. J. Eng, B. Fisher, R. L. Harvey et al., “Guidelines for adult stroke rehabilitation and recovery: a guideline for healthcare professionals from the american heart association/american stroke association”, Stroke, vol. 47, no. 6, pp. e98-e169, 2016.

8. R. Ibanez, A. Soria, A. Teyseyre, M. Campo, “Easy gesture recognition for kinect”, Advances in Engineering Software, vol. 76, pp. 171-180, 2014.

9. R. Ibañez, A. Soria, A. R. Teyseyre, L. Berdun, M. R. Campo, “A comparative study of machine learning techniques for gesture recognition using kinect”, Handbook of Research on Human-Computer Interfaces Developments and Applications, pp. 1-22, 2016.

10. S. Bhattacharya, B. Czejdo, N. Perez, “Gesture classification with machine learning using kinect sensor data”, Emerging Applications of Information Technology (EAIT) 2012 Third International Conference on, pp. 348-351, 2012.

11. K. Laver, S. George, S. Thomas, J. E. Deutsch, M. Crotty, “Virtual reality for stroke rehabilitation”, Stroke, vol. 43, no. 2, pp. e20-e21, 2012.

12. G. Saposnik, M. Levin, S. O. R. C. S. W. Group et al., “Virtual reality in stroke rehabilitation: a meta-analysis and implications for clinicians”, Stroke, vol. 42, no. 5, pp. 1380-1386, 2011.

13. K. R. Anderson, M. L. Woodbury, K. Phillips, L. V. Gauthier, “Virtual reality video games to promote movement recovery in stroke rehabilitation: a guide for clinicians”, Archives of physical medicine and rehabilitation, vol. 96, no. 5, pp. 973-976, 2015.

14. A. Estepa, S. S. Piriz, E. Albornoz, C. Martínez, “Development of a kinect-based exergaming system for motor rehabilitation in neurological disorders”, Journal of Physics: Conference Series, vol. 705, pp. 012060, 2016.

15. E. Chang, X. Zhao, S. C. Cramer et al., “Home-based hand rehabilitation after chronic stroke: Randomized controlled single-blind trial comparing the musicglove with a conventional exercise program”, Journal of rehabilitation research and development, vol. 53, no. 4, pp. 457, 2016.

16. L. Ebert, P. Flach, M. Thali, S. Ross, “Out of touch-a plugin for controlling osirix with gestures using the leap controller”, Journal of Forensic Radiology and Imaging, vol. 2, no. 3, pp. 126-128, 2014.

17. W.-J. Li, C.-Y. Hsieh, L.-F. Lin, W.-C. Chu, “Hand gesture recognition for post-stroke rehabilitation using leap motion”, Applied System Innovation (ICASI) 2017 International Conference on, pp. 386-388, 2017.

18. K. Vamsikrishna, D. P. Dogra, M. S. Desarkar, “Computer-vision-assisted palm rehabilitation with supervised learning”, IEEE Transactions on Biomedical Engineering, vol. 63, no. 5, pp. 991-1001, 2016.

19. A. Butt, E. Rovini, C. Dolciotti, P. Bongioanni, G. De Petris, F. Cavallo, “Leap motion evaluation for assessment of upper limb motor skills in parkinson’s disease”, Rehabilitation Robotics (ICORR) 2017 International Conference on, pp. 116-121, 2017.

20. L. Di Tommaso, S. Aubry, J. Godard, H. Katranji, J. Pauchot, “A new human machine interface in neurosurgery: The leap motion (®). technical note regarding a new touchless interface”, Neuro-Chirurgie, vol. 62, no. 3, pp. 178-181, 2016.

21. O. Chapelle, “Training a support vector machine in the primal”, Neural computation, vol. 19, no. 5, pp. 1155-1178, 2007.

22. Y. Ma, G. Guo, Support vector machines applications, Springer, 2014.

23. J. Guna, G. Jakus, M. Pogačnik, S. Tomažič, J. Sodnik, “An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking”, Sensors, vol. 14, no. 2, pp. 3702-3720, 2014.

24. T. D’Orazio, R. Marani, V. Renó, G. Cicirelli, “Recent trends in gesture recognition: how depth data has improved classical approaches”, Image and Vision Computing, vol. 52, pp. 56-72, 2016.

25. Leap Motion, Leap Motion SDK, 2015.


via Hand Rehabilitation via Gesture Recognition Using Leap Motion Controller – IEEE Conference Publication


[ARTICLE] Measurements by A LEAP-Based Virtual Glove for the Hand Rehabilitation – Full Text

Abstract

Hand rehabilitation is fundamental after stroke or surgery. Traditional rehabilitation requires a therapist and implies high costs, stress for the patient, and subjective evaluation of the therapy’s effectiveness. Alternative approaches, based on mechanical and tracking-based gloves, can be very effective when used in virtual reality (VR) environments. Mechanical devices are often expensive, cumbersome, patient specific and hand specific, while tracking-based devices are free of these limitations but, especially if based on a single tracking sensor, can suffer from occlusions. In this paper, the implementation of a multi-sensor approach, the Virtual Glove (VG), based on the simultaneous use of two orthogonal LEAP Motion controllers, is described. The VG is calibrated, and static positioning measurements are compared with those collected with an accurate spatial positioning system. The positioning error is lower than 6 mm in a cylindrical region of interest of radius 10 cm and height 21 cm. Real-time hand tracking measurements are also performed, analysed and reported. They show that the VG operated in real time (60 fps), reduced occlusions, and managed the two LEAP sensors correctly, without any temporal or spatial discontinuity when switching from one sensor to the other. A video demonstrating the performance of the VG is presented in the Supplementary Materials. Results are promising, but further work is needed to allow the calculation of the forces exerted by each finger when constrained by mechanical tools (e.g., peg-boards) and to reduce occlusions when grasping these tools. Although the VG is proposed for rehabilitation purposes, it could also be used for tele-operation of tools and robots, and for other VR applications.

1. Introduction

Hand rehabilitation is extremely important for recovering from post-stroke or post-surgery residual impairments, and its effectiveness depends on the frequency, duration and quality of the rehabilitation sessions [1]. Traditional rehabilitation requires a therapist to guide and monitor patients during sessions, and the procedure’s effectiveness is evaluated subjectively by the therapist, based on experience. In recent years, several automated (tele)rehabilitation gloves, based on mechanical devices or tracking sensors, have been presented [2,3,4,5,6,7,8,9,10]. These gloves allow therapy to be executed at home, and rehabilitation effectiveness can be analytically calculated, summarized in numerical parameters, and monitored by therapists through the Internet. Moreover, this equipment can easily be interfaced with virtual reality (VR) environments [11], which have been proven to increase rehabilitation efficacy [12]. Mechanical devices are equipped with pressure sensors and pneumatic actuators for assisting and monitoring hand movements and for applying forces that the patient has to oppose [13,14]. However, they are expensive, cumbersome, patient specific (different patients cannot reuse the same system) and hand specific (the patient cannot use the same system indifferently with both hands). Tracking-based gloves rely on computer vision algorithms that analyse and interpret videos from depth-sensing sensors to calculate hand kinematics in real time [10,15,16,17,18,19]. Among such devices, the LEAP [20] is a small, low-cost hand 3D tracking device characterized by high resolution and high reactivity [21,22,23]; it has been used in VR [24] and recently presented and tested with success in hand rehabilitation, with exercises designed in VR environments [25]. Despite the advantages of using the LEAP with VR, a single sensor does not allow accurate quantitative evaluation of hand and finger tracking in the case of occlusions.
The system proposed in [10] consisted of two orthogonal LEAPs designed to reduce occlusions and to improve objective hand-tracking evaluation. The two sensors were fixed to a wooden support that kept them orthogonal to each other. The previous prototype was useful to test the robustness of each sensor, in the presence of the other, against potential infra-red interference, to verify that the maximum operative range of each sensor was maintained and, finally, to demonstrate the hand tracking idea. However, it was imprecise, owing to the rough VG support and positioning system, the non-optimal reciprocal positioning of the sensors, and the impossibility of performing a reciprocal calibration independent of the sensors’ measurements. This prevented both evaluation of the intrinsic precision of the VG and accurate, real-time quantitative hand tracking measurements. In this paper, we present a method for constructing an engineered version of the LEAP-based VG, together with a technique for its accurate calibration, for collecting accurate positioning measurements, and for high-quality evaluation of the positioning errors specific to the VG. Moreover, real-time experimental hand tracking measurements were collected (a video demonstrating real-time performance and precision is provided in the Supplementary Materials), presented and discussed. […]
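Conceptually, fusing two orthogonal sensors means mapping one sensor's readings into the other's coordinate frame via a calibrated rigid transform, then preferring whichever sensor currently sees the hand better. The sketch below illustrates that idea with a hypothetical rotation, offset and confidence score; it is not the paper's implementation:

```python
import numpy as np

def to_reference_frame(p_local, R, t):
    """Map a point from the second sensor's frame into the reference
    frame, using a rotation R and translation t assumed to come from a
    prior calibration step (illustrative values below)."""
    return R @ p_local + t

def fuse(p_ref, p_other, conf_ref, conf_other, R, t):
    """Return the reading from whichever sensor reports higher tracking
    confidence (e.g., less occlusion), after aligning frames."""
    if conf_ref >= conf_other:
        return p_ref
    return to_reference_frame(p_other, R, t)

# Hypothetical calibration: a 90-degree rotation about x aligns a
# vertically mounted second sensor; t is a made-up offset in mm.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
t = np.array([0.0, -120.0, 0.0])

# The occluded reference sensor (low confidence) is overridden by the
# second sensor's transformed reading.
fused = fuse(np.array([10.0, 200.0, 30.0]),
             np.array([12.0, 28.0, -198.0]),
             conf_ref=0.4, conf_other=0.9, R=R, t=t)
```

Because both readings live in one reference frame after the transform, switching sensors introduces no spatial jump, which is the continuity property the paper reports.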


Continue —>  Measurements by A LEAP-Based Virtual Glove for the Hand Rehabilitation


Figure 1
VG mounted on its aluminium support.


[Abstract & References] A human–computer interface for wrist rehabilitation: a pilot study using commercial sensors to detect wrist movements

Abstract

Health conditions might cause muscle weakness and immobility in some body parts; hence, physiotherapy exercises play a key role in rehabilitation. To improve engagement during the rehabilitation process, we propose a human–computer interface (a serious game) in which five wrist movements (extension, flexion, pronation, supination and neutral) are detected via two commercial sensors (the Leap Motion controller and the Myo armband). The Leap Motion provides data on the positions of the user’s finger phalanges through two infrared cameras, while the Myo armband provides the electromyography signal and inertial motion of the user’s arm through its electrodes and inertial measurement unit. The main aim of this study is to explore the performance of these sensors in wrist movement recognition in terms of accuracy, sensitivity and specificity. Eight healthy participants played the proposed game 5 times with each sensor in one session. Both sensors achieved over 85% average recognition accuracy on the five wrist movements. Based on the t test and the Wilcoxon signed-rank test, early results show significant differences between Leap Motion controller and Myo armband recognition in terms of average sensitivity on the extension (p = 0.0356), flexion (p = 0.0356) and pronation (p = 0.0440) movements, and average specificity on the extension (p = 0.0276) and pronation (p = 0.0249) movements.
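Sensitivity and specificity, the metrics compared above, follow directly from per-class true/false positive and negative counts. A small self-contained sketch (with a toy label set standing in for the study's recognition output) shows the computation for one movement treated as the positive class:

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred, positive):
    """Per-class sensitivity (recall) and specificity, treating one
    wrist movement label as the positive class and all others as
    negative, computed from true/predicted label sequences."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == positive) & (y_pred == positive))
    fn = np.sum((y_true == positive) & (y_pred != positive))
    tn = np.sum((y_true != positive) & (y_pred != positive))
    fp = np.sum((y_true != positive) & (y_pred == positive))
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels only (not the study's data): "ext" = extension, etc.
y_true = ["ext", "flex", "ext", "pro", "ext", "sup"]
y_pred = ["ext", "flex", "flex", "pro", "ext", "ext"]
sens, spec = sensitivity_specificity(y_true, y_pred, "ext")
```

Averaging these per-class values over participants yields the "average sensitivity" and "average specificity" figures that the t test and Wilcoxon signed-rank test then compare between the two sensors.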

References

1. Gunasekera, W.L., Bendall, J.: Rehabilitation of neurologically injured patients in neurosurgery. In: Moore, A.J., Newell, D.W. (eds.) Springer Specialist Surgery Series, pp. 407–421. Springer, London (2005)
2. Rego, P., Moreira, P.M., Reis, L.P.: Serious games for rehabilitation: a survey and a classification towards a taxonomy. In: Proceedings of the 5th Iberian Conference on Information Systems and Technologies, pp. 1–6. IEEE (2010)
3. Granic, I., Lobel, A., Engels, R.C.: The benefits of playing video games. Am. Psychol. 69(1), 66–78 (2014)
4. Russoniello, C.V., O’Brien, K., Parks, J.M.: EEG, HRV and psychological correlates while playing Bejeweled II: a randomized controlled study. In: Wiederhold, B.K., Riva, G. (eds.) Annual Review of Cybertherapy and Telemedicine. Advanced Technologies in the Behavioral, Social and Neurosciences, pp. 189–192. IOS Press, Amsterdam (2009)
5. Zyda, M.: From visual simulation to virtual reality to games. Computer 38(9), 25–32 (2005)
6. Ma, M., Bechkoum, K.: Serious games for movement therapy after stroke. In: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, pp. 1872–1877. IEEE (2008)
7. Boian, R., Sharma, A., Han, C., Merians, A., Burdea, G., Adamovich, S., Poizner, H.: Virtual reality-based post-stroke hand rehabilitation. In: Proceedings of the Medicine Meets Virtual Reality Conference, pp. 64–70. IOS Press (2002)
8. Asadipour, A., Debattista, K., Chalmers, A.: Visuohaptic augmented feedback for enhancing motor skills acquisition. Vis. Comput. 33(4), 401–411 (2017)
9. Burke, J.W., McNeill, M.D.J., Charles, D.K., Morrow, P.J., Crosbie, J.H., McDonough, S.M.: Optimising engagement for stroke rehabilitation using serious games. Vis. Comput. 25(12), 1085–1099 (2009)
10. Burke, J.W., McNeill, M., Charles, D., Morrow, P., Crosbie, J., McDonough, S.: Serious games for upper limb rehabilitation following stroke. In: Proceedings of the Games and Virtual Worlds for Serious Applications, pp. 103–110. IEEE (2009)
11. Aristidou, A.: Hand tracking with physiological constraints. Vis. Comput. (2016). doi:10.1007/s00371-016-1327-8
12. Chang, Y.J., Han, W.Y., Tsai, Y.C.: A Kinect-based upper limb rehabilitation system to assist people with cerebral palsy. Res. Dev. Disabil. 34(11), 3654–3659 (2013)
13. Roy, A.K., Soni, Y., Dubey, S.: Enhancing effectiveness of motor rehabilitation using Kinect motion sensing technology. In: Proceedings of the Global Humanitarian Technology Conference: South Asia Satellite (GHTC-SAS), pp. 298–304. IEEE (2013)
14. Pedersoli, F., Benini, S., Adami, N., Leonardi, R.: XKin: an open source framework for hand pose and gesture recognition using Kinect. Vis. Comput. 30(10), 1107–1122 (2014)
15. Da Gama, A., Fallavollita, P., Teichrieb, V., Navab, N.: Motor rehabilitation using Kinect: a systematic review. Games Health J. 4(2), 123–135 (2015)
16. Chuan, C.H., Regina, E., Guardino, C.: American Sign Language recognition using Leap Motion sensor. In: Proceedings of the 13th International Conference on Machine Learning and Applications (ICMLA), pp. 541–544. IEEE (2014)
17. Marin, G., Dominio, F., Zanuttigh, P.: Hand gesture recognition with Leap Motion and Kinect devices. In: Proceedings of the International Conference on Image Processing (ICIP), pp. 1565–1569. IEEE (2014)
18. Funasaka, M., Ishikawa, Y., Takata, M., Joe, K.: Sign language recognition using Leap Motion controller. In: Proceedings of the International Conference on Parallel and Distributed Processing Techniques and Applications (PDPTA), pp. 263–269 (2015)
19. Bizzotto, N., Costanzo, A., Bizzotto, L.: Leap Motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surg. Innov. 21(6), 655–656 (2014)
20. Mewes, A., Saalfeld, P., Riabikin, O., Skalej, M., Hansen, C.: A gesture-controlled projection display for CT-guided interventions. Int. J. Comput. Assist. Radiol. Surg. 11(1), 157–164 (2016)
21. Bassily, D., Georgoulas, C., Guettler, J., Linner, T., Bock, T.: Intuitive and adaptive robotic arm manipulation using the Leap Motion controller. In: Proceedings of the International Symposium on Robotics ISR/Robotik, pp. 1–7. IEEE (2014)
22. dos Reis Alves, S.F., Uribe-Quevedo, A.J., Nunes da Silva, I., Ferasoli Filho, H.: Pomodoro, a mobile robot platform for hand motion exercising. In: Proceedings of the International Conference on Biomedical Robotics and Biomechatronics, pp. 970–974. IEEE (2014)
23. Yu, N., Xu, C., Wang, K., Yang, Z., Liu, J.: Gesture-based telemanipulation of a humanoid robot for home service tasks. In: Proceedings of the International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), pp. 1923–1927. IEEE (2015)
24. Sonntag, D., Orlosky, J., Weber, M., Gu, Y., Sosnovsky, S., Toyama, T., Toosi, E.N.: Cognitive monitoring via eye tracking in virtual reality pedestrian environments. In: Proceedings of the 4th International Symposium on Pervasive Displays, pp. 269–270. ACM (2015)
25. Khademi, M., Mousavi Hondori, H., McKenzie, A., Dodakian, L., Lopes, C.V., Cramer, S.C.: Free-hand interaction with Leap Motion controller for stroke rehabilitation. In: Proceedings of the CHI’14 Extended Abstracts on Human Factors in Computing Systems, pp. 1663–1668. ACM (2014)
26. Charles, D., Pedlow, K., McDonough, S., Shek, K., Charles, T.: Close range depth sensing cameras for virtual reality based hand rehabilitation. J. Assist. Technol. 8(3), 138–149 (2014)
27. Grubišić, I., Skala Kavanagh, H., Grazio, S.: Novel approaches in hand rehabilitation. Period. Biol. 117(1), 139–145 (2015)
28. Qamar, A., Rahman, M.A., Basalamah, S.: Adding inverse kinematics for providing live feedback in a serious game-based rehabilitation system. In: Proceedings of the International Conference on Intelligent Systems, Modelling and Simulation (ISMS), pp. 215–220. IEEE (2014)
29. Liang, H., Chang, J., Kazmi, I.K., Zhang, J.J., Jiao, P.: Hand gesture-based interactive puppetry system to assist storytelling for children. Vis. Comput. 33(4), 517–531 (2017)
30. Shen, J., Luo, Y., Wu, Z., Tian, Y., Deng, Q.: CUDA-based real-time hand gesture interaction and visualization for CT volume dataset using Leap Motion. Vis. Comput. 32(3), 359–370 (2016)
31. Hettig, J., Mewes, A., Riabikin, O., Skalej, M., Preim, B., Hansen, C.: Exploration of 3D medical image data for interventional radiology using myoelectric gesture control. In: Proceedings of the Eurographics Workshop on Visual Computing for Biology and Medicine, pp. 177–185. ACM (2015)
32. Gándara, C.V., Bauza, C.G.: IntelliHome: a framework for the development of ambient assisted living applications based in low-cost technology. In: Proceedings of the Latin American Conference on Human Computer Interaction, p. 18. ACM (2015)
33. Qamar, A.M., Khan, A.R., Husain, S.O., Rahman, M.A., Baslamah, S.A.: Multi-sensory gesture-based occupational therapy environment for controlling home appliances. In: Proceedings of the 5th ACM International Conference on Multimedia Retrieval, pp. 671–674. ACM (2015)
34. McCullough, M., Xu, H., Michelson, J., Jackoski, M., Pease, W., Cobb, W., Williams, B.: Myo arm: swinging to explore a VE. In: Proceedings of the ACM SIGGRAPH Symposium on Applied Perception, pp. 107–113. ACM (2015)
35. Rondinelli, R.D., Genovese, E., Brigham, C.R.: Guides to the Evaluation of Permanent Impairment. American Medical Association, Chicago (2008)
36. Norkin, C.C., White, D.J.: Measurement of Joint Motion: A Guide to Goniometry, 5th edn. FA Davis Company, Philadelphia (2016)
37. O’Brien, A.V., Jones, P., Mullis, R., Mulherin, D., Dziedzic, K.: Conservative hand therapy treatments in rheumatoid arthritis: a randomized controlled trial. Rheumatology 45(5), 577–583 (2006)
38. Wakefield, A.E., McQueen, M.M.: The role of physiotherapy and clinical predictors of outcome after fracture of the distal radius. J. Bone Joint Surg. 82(7), 972–976 (2000)

Source: A human–computer interface for wrist rehabilitation: a pilot study using commercial sensors to detect wrist movements | SpringerLink


[ARTICLE] A Review on Technical and Clinical Impact of Microsoft Kinect on Physical Therapy and Rehabilitation – Full Text HTML

Abstract

This paper reviews the technical and clinical impact of the Microsoft Kinect in physical therapy and rehabilitation. It covers studies on patients with neurological disorders, including stroke, Parkinson’s disease, cerebral palsy, and multiple sclerosis, as well as elderly patients. Search results in PubMed and Google Scholar reveal increasing interest in using the Kinect in medical applications. Relevant papers are reviewed and divided into three groups: (1) papers which evaluated the Kinect’s accuracy and reliability, (2) papers which used the Kinect in a rehabilitation system and provided clinical evaluation involving patients, and (3) papers which proposed a Kinect-based system for rehabilitation but fell short of providing clinical validation. Finally, to serve as a technical comparison that may help future rehabilitation design, other sensors similar to the Kinect are reviewed.

1. Introduction

Traditionally, a great portion of physical therapy and rehabilitation assessment of stroke patients is based on a therapist’s observation and judgment. The assessment methods (e.g., the Fugl-Meyer Assessment of Physical Performance [1]) rely heavily on the therapist’s visual assessment of how the patient performs a standard task. This process requires a trained Physical Therapist (PT) or Occupational Therapist (OT) to spend one-on-one time with the patient. Yet the assessment can be inaccurate for several reasons, one of which is the subjectivity of these behavioral and clinical assessments. Sensor and computing technology that can be used for motion capture has advanced drastically in the past few years, becoming more capable and affordable. Motion capture (MoCap) systems record the human body’s kinematic data with high accuracy and reliability; analysis of MoCap data yields better clinical and behavioral assessment and, accordingly, more efficient therapeutic decision making.
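A basic building block of MoCap-based assessment is computing joint angles from tracked 3D joint positions, such as the elbow angle from shoulder, elbow and wrist points in a Kinect skeleton stream. A minimal sketch of that computation (with made-up coordinates, not real Kinect output) could be:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, formed by 3D points a-b-c,
    e.g. shoulder-elbow-wrist positions from a skeleton frame."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Illustrative frame: elbow at the origin, forearm perpendicular to
# the upper arm, so the elbow angle is a right angle.
angle = joint_angle([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0])
```

Tracking such angles over the course of a standard task is one way a MoCap pipeline turns the therapist's visual judgment into reproducible numerical measurements.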

Full Text HTML –> A Review on Technical and Clinical Impact of Microsoft Kinect on Physical Therapy and Rehabilitation.

