Posts Tagged human computer interaction

[Abstract] Towards an Immersive Virtual Reality Game for Smarter Post-Stroke Rehabilitation

Abstract:

Traditional forms of physical therapy and rehabilitation are often based on therapist observation and judgment, a process that can be inaccurate, expensive, and untimely. Modern immersive virtual reality systems provide a unique opportunity to make the therapy process smarter. In this paper, we present an immersive virtual reality stroke rehabilitation game based on a widely accepted therapy method, Constraint-Induced Therapy, that was evaluated by nine post-stroke participants. We implement our game as a dynamically adapting system that accounts for the user’s motor abilities while recording real-time motion capture and behavioral data. The game can also be used for tele-rehabilitation, allowing therapists to connect with the participant remotely while having access to real-time biofeedback data at 90+ Hz. Our quantitative and qualitative results suggest that our system is useful in increasing the affordability, accuracy, and accessibility of post-stroke motor treatment.
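
The paper does not spell out its adaptation logic, but a dynamically adapting system of the kind described is commonly driven by the player's recent success rate. A minimal sketch in Python; the function name, thresholds, and step sizes are illustrative assumptions, not details from the paper:

```python
def adapt_target_reach(current_reach_m, success_rate,
                       step=0.02, lower=0.1, upper=0.6):
    """Adjust how far (in metres) in-game targets appear, based on the
    fraction of targets the player hit in the last block of trials."""
    if success_rate > 0.8:      # too easy: push targets further out
        current_reach_m += step
    elif success_rate < 0.5:    # too hard: bring targets closer
        current_reach_m -= step
    # Clamp to the range of motion considered safe for this player.
    return min(upper, max(lower, current_reach_m))

# A player hitting 90% of targets is asked to reach slightly further;
# one hitting 40% gets easier targets.
harder = adapt_target_reach(0.30, 0.90)
easier = adapt_target_reach(0.30, 0.40)
```

Run once per block of trials, a loop like this keeps the task near the edge of the player's motor abilities without ever leaving the safe range.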

via Towards an Immersive Virtual Reality Game for Smarter Post-Stroke Rehabilitation – IEEE Conference Publication


[Abstract] Virtual Activities of Daily Living for Recovery of Upper Extremity Motor Function

Abstract

A study was conducted to investigate the effectiveness of virtual activities of daily living (ADL) practice using the SaeboVR software system for the recovery of upper extremity (UE) motor function following stroke. The system employs Kinect sensor-based tracking to translate human UE motion into the anatomical pose of the arm of the patient’s avatar within a virtual environment, creating a virtual presence within a simulated task space. Patients gain mastery of 12 different integrated activities while traversing a metaphorical ‘road to recovery’ that includes thematically linked levels and therapist-selected difficulty settings. Clinical trials were conducted under the study named Virtual Occupational Therapy Application. A total of 15 chronic phase stroke survivors completed a protocol involving three sessions per week over eight weeks, during which they engaged in repetitive task practice through performance of the virtual ADLs. Results show a clinically important improvement and statistically significant difference in Fugl-Meyer UE assessment scores in the study population of chronic stroke survivors over the eight-week interventional period compared with a non-interventional control period of equivalent duration. Statistically significant and clinically important improvements are also found in the Wolf Motor Function Test scores. These results provide new evidence for the use of virtual ADL practice as a tool for UE therapy for stroke patients. Limitations of the study include non-blinded assessments and the possibility of selection and/or attrition bias. © 2017 IEEE.
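
The Kinect-to-avatar mapping described above rests on computing joint angles from tracked 3D joint positions. A minimal illustration of that step (not code from the SaeboVR system; the helper name and sample coordinates are assumptions):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D points a-b-c, e.g. the
    elbow angle from the shoulder, elbow, and wrist positions reported
    by a skeleton tracker such as the Kinect."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to guard against rounding just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Arm bent at a right angle: shoulder above the elbow, wrist in front.
elbow = joint_angle((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

Angles computed this way can be applied directly to the avatar's arm rig, which is what lets therapist-selected difficulty settings be expressed in anatomical terms.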


via Scopus preview – Scopus – Document details


[Abstract+References] The Face Tracking System for Rehabilitation Robotics Applications – Conference paper

Abstract

The paper presents a working model of a face tracking system. The proposed solution may be used as one part of a rehabilitation or assistive robotic system, serving as the robotic vision subsystem or as a module controlling a robotic arm. It is a low-cost design based on open-source hardware and software components: a Raspberry Pi computer serves as the hardware base, and the machine vision software is written in Python using the OpenCV computer vision library.
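
As a rough illustration of how such a vision subsystem can drive a robotic arm: in OpenCV, `cv2.CascadeClassifier(...).detectMultiScale()` typically supplies face bounding boxes, and a proportional controller converts the face's offset from the frame center into motion commands. The sketch below covers only that second, camera-free step; the function name and gain are illustrative assumptions, not the paper's design:

```python
def face_to_pan_tilt(face, frame_w, frame_h, gain=0.1):
    """Convert a detected face bounding box (x, y, w, h) in pixels into
    incremental pan/tilt commands (degrees) that re-center the face.
    Positive pan turns right, positive tilt looks up."""
    x, y, w, h = face
    face_cx = x + w / 2.0
    face_cy = y + h / 2.0
    # Proportional control on the pixel error from the frame center.
    pan = gain * (face_cx - frame_w / 2.0)
    tilt = gain * (frame_h / 2.0 - face_cy)
    return pan, tilt

# A face detected left of and below the center of a 640x480 frame
# produces a pan-left, tilt-down correction.
pan, tilt = face_to_pan_tilt((100, 300, 80, 80), 640, 480)
```

Calling this on each frame's detection result yields a simple closed loop that keeps the tracked face centered.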


References

  1. H. Zhou, H. Hu: Human motion tracking for rehabilitation – A survey. Biomedical Signal Processing and Control, Volume 3, Issue 1, pp. 1–18, January 2008.
  2. C. Rougier, J. Meunier, A. St-Arnaud, J. Rousseau: Monocular 3D Head Tracking to Detect Falls of Elderly People. Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, USA, Aug 30–Sept 3, 2006.
  3. L. Boccanfuso, J. M. O’Kane: Adaptive Robot Design with Hand and Face Tracking for Use in Autism Therapy. Second International Conference on Social Robotics, ICSR 2010, Singapore, November 23–24, 2010, pp. 265–274. DOI: 10.1007/978-3-642-17248-9_28.
  4. T. Fong, I. Nourbakhsh, K. Dautenhahn: A survey of socially interactive robots. Robotics and Autonomous Systems, Volume 42, Issues 3–4, pp. 143–166, March 2003.
  5. P. Jia, H. H. Hu, T. Lu, K. Yuan: Head gesture recognition for hands free control of an intelligent wheelchair. Industrial Robot: An International Journal, Vol. 34, Iss. 1, pp. 60–68, 2007.
  6. P. Raif, J.A. Starzyk: Motivated learning in autonomous systems. The 2011 International Joint Conference on Neural Networks (IJCNN), pp. 603–610, 2011.
  7. J.A. Starzyk, J.T. Graham, P. Raif, A.H. Tan: Motivated Learning for Autonomous Robots Development. Cognitive Science Research, 14, 1, 2011.
  11. J.E. Solem: Programming Computer Vision with Python: Tools and algorithms for analyzing images. O’Reilly Media, 2012.
  14. K. Demaagd, A. Oliver, N. Oostendorp, K. Scott: Practical Computer Vision with SimpleCV: The Simple Way to Make Technology See. O’Reilly Media, 2012.
  19. J. Howse: OpenCV Computer Vision with Python. CreateSpace Independent Publishing Platform, 2015.
  20. G. Bradski, A. Kaehler: Learning OpenCV: Computer Vision with the OpenCV Library. O’Reilly Media, 2008.
  21. P. Viola, M. Jones: Rapid object detection using a boosted cascade of simple features. In: Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 511–518, 2001.
  22. P. Viola, M. Jones: Robust Real-time Object Detection. International Journal of Computer Vision, 57(2), pp. 137–154, 2004.

via The Face Tracking System for Rehabilitation Robotics Applications | SpringerLink


[Abstract+References] Impact of commercial sensors in human computer interaction: a review

Abstract

Nowadays, the communication gap between humans and computers can be narrowed thanks to the multimodal sensors available on the market. It is therefore important to know these sensors’ specifications and how they are being used in order to create human-computer interfaces that tackle complex tasks. The purpose of this paper is to review recent research regarding the up-to-date application areas of the following sensors:

(1) Emotiv sensor, which identifies emotions, facial expressions, thoughts, and head movements from users through electroencephalography signals,

(2) Leap motion controller, which recognizes hand and arm movements via vision techniques,

(3) Myo armband, which identifies hand and arm movements using electromyography signals and inertial sensors, and

(4) Oculus rift, which provides immersion into virtual reality to users.

The application areas discussed in this manuscript range from assistive technology to virtual tours. Finally, a brief discussion regarding the advantages and shortcomings of each sensor is presented.

References

  1. Abreu JG, Teixeira JM, Figueiredo LS, Teichrieb V (2016) Evaluating sign language recognition using the myo armband. In: Virtual and augmented reality (SVR), 2016 XVIII symposium on, IEEE, pp 64–70
  2. Bassily D, Georgoulas C, Guettler J, Linner T, Bock T (2014) Intuitive and adaptive robotic arm manipulation using the Leap motion controller. In: ISR/Robotik 2014; 41st international symposium on robotics; proceedings of, VDE, pp 1–7
  3. Bernardos AM, Sánchez JM, Portillo JI, Wang X, Besada JA, Casar JR (2016) Design and deployment of a contactless hand-shape identification system for smart spaces. J Ambient Intell Humaniz Comput 7(3):357–370
  4. Blaha J, Gupta M (2014) Diplopia: A virtual reality game designed to help amblyopics. In: Virtual reality (VR), 2014 IEEE, IEEE, pp 163–164
  5. Boschmann A, Dosen S, Werner A, Raies A, Farina D (2016) A novel immersive augmented reality system for prosthesis training and assessment. In: Biomedical and health informatics (BHI), 2016 IEEE-EMBS international conference on, IEEE, pp 280–283
  6. Brennan CP, McCullagh PJ, Galway L, Lightbody G (2015) Promoting autonomy in a smart home environment with a smarter interface. In: Engineering in medicine and biology society (EMBC), 2015 37th annual international conference of the IEEE, IEEE, pp 5032–5035
  7. Cacace J, Finzi A, Lippiello V, Furci M, Mimmo N, Marconi L (2016) A control architecture for multiple drones operated via multimodal interaction in search & rescue mission. In: Safety, security, and rescue robotics (SSRR), 2016 IEEE international symposium on, IEEE, pp 233–239
  8. Carrino F, Tscherrig J, Mugellini E, Khaled OA, Ingold R (2011) Head-computer interface: a multimodal approach to navigate through real and virtual worlds. In: International conference on human-computer interaction, Springer, pp 222–230
  9. Charles D, Pedlow K, McDonough S, Shek K, Charles T (2014) Close range depth sensing cameras for virtual reality based hand rehabilitation. J Assist Technol 8(3):138–149
  10. Chuan CH, Regina E, Guardino C (2014) American sign language recognition using Leap motion sensor. In: Machine learning and applications (ICMLA), 2014 13th international conference on, IEEE, pp 541–544
  11. Ciolan IM, Buraga SC, Dafinoiu I (2016) Oculus rift 3D interaction and nicotine craving: results from a pilot study. In: ROCHI–international conference on human-computer interaction, p 58
  12. Da Gama A, Fallavollita P, Teichrieb V, Navab N (2015) Motor rehabilitation using Kinect: a systematic review. Games Health J 4(2):123–135
  13. dos Reis Alves SF, Uribe-Quevedo AJ, da Silva IN, Ferasoli Filho H (2014) Pomodoro, a mobile robot platform for hand motion exercising. In: Biomedical robotics and biomechatronics 2014 5th IEEE RAS & EMBS international conference on, IEEE, pp 970–974
  14. Duvinage M, Castermans T, Petieau M, Hoellinger T, Cheron G, Dutoit T (2013) Performance of the emotiv epoc headset for P300-based applications. Biomed Eng Online 12(1):56
  15. Farahani N, Post R, Duboy J, Ahmed I, Kolowitz BJ, Krinchai T, Monaco SE, Fine JL, Hartman DJ, Pantanowitz L (2016) Exploring virtual reality technology and the Oculus rift for the examination of digital pathology slides. J Pathol Inform 7
  16. Fiałek S, Liarokapis F (2016) Comparing two commercial brain computer interfaces for serious games and virtual environments. In: Karpouzis K, Yannakakis GN (eds) Emotion in games, Springer, Switzerland, pp 103–117
  17. Funasaka M, Ishikawa Y, Takata M, Joe K (2015) Sign language recognition using Leap motion controller. In: Proceedings of the international conference on parallel and distributed processing techniques and applications (PDPTA), the steering committee of the world congress in computer science, computer engineering and applied computing (WorldComp), p 263
  18. Gándara CV, Bauza CG (2015) Intellihome: a framework for the development of ambient assisted living applications based in low-cost technology. In: Proceedings of the Latin American conference on human computer interaction, ACM, p 18
  19. Gomez-Gil J, San-Jose-Gonzalez I, Nicolas-Alonso LF, Alonso-Garcia S (2011) Steering a tractor by means of an EMG-based human-machine interface. Sensors 11(7):7110–7126
  20. Gonzalez-Sanchez J, Chavez-Echeagaray ME, Atkinson R, Burleson W (2011) Abe: an agent-based software architecture for a multimodal emotion recognition framework. In: Software architecture (WICSA), 2011 9th working IEEE/IFIP conference on, IEEE, pp 187–193
  21. Grubišić I, Skala Kavanagh H, Grazio S (2015) Novel approaches in hand rehabilitation. Period Biol 117(1):139–145
  22. Guna J, Jakus G, Pogačnik M, Tomažič S, Sodnik J (2014) An analysis of the precision and reliability of the Leap motion sensor and its suitability for static and dynamic tracking. Sensors 14(2):3702–3720
  23. Gunasekera WL, Bendall J (2005) Rehabilitation of neurologically injured patients. In: Moore AJ, Newell DW (eds) Neurosurgery, Springer, London, pp 407–421
  24. Güttler J, Shah R, Georgoulas C, Bock T (2015) Unobtrusive tremor detection and measurement via human-machine interaction. Proced Comput Sci 63:467–474
  25. Han J, Shao L, Xu D, Shotton J (2013) Enhanced computer vision with Microsoft Kinect sensor: a review. IEEE Trans Cybern 43(5):1318–1334
  26. Hettig J, Mewes A, Riabikin O, Skalej M, Preim B, Hansen C (2015) Exploration of 3D medical image data for interventional radiology using myoelectric gesture control. In: Proceedings of the eurographics workshop on visual computing for biology and medicine, eurographics association, pp 177–185
  27. Ijjada MS, Thapliyal H, Caban-Holt A, Arabnia HR (2015) Evaluation of wearable head set devices in older adult populations for research. In: Computational science and computational intelligence (CSCI), 2015 international conference on, IEEE, pp 810–811
  28. Jurcak V, Tsuzuki D, Dan I (2007) 10/20, 10/10, and 10/5 systems revisited: their validity as relative head-surface-based positioning systems. Neuroimage 34(4):1600–1611
  29. Kefer K, Holzmann C, Findling RD (2016) Comparing the placement of two arm-worn devices for recognizing dynamic hand gestures. In: Proceedings of the 14th international conference on advances in mobile computing and multi media, ACM, pp 99–104
  30. Khademi M, Mousavi Hondori H, McKenzie A, Dodakian L, Lopes CV, Cramer SC (2014) Free-hand interaction with Leap motion controller for stroke rehabilitation. In: Proceedings of the extended abstracts of the 32nd annual ACM conference on human factors in computing systems, ACM, pp 1663–1668
  31. Khan FR, Ong HF, Bahar N (2016) A sign language to text converter using Leap motion. Int J Adv Sci Eng Inf Technol 6(6):1089–1095
  32. Kim SY, Kim YY (2012) Mirror therapy for phantom limb pain. Korean J Pain 25(4):272–274
  33. Kiorpes L, McKee SP (1999) Neural mechanisms underlying amblyopia. Curr Opin Neurobiol 9(4):480–486
  34. Kleven NF, Prasolova-Førland E, Fominykh M, Hansen A, Rasmussen G, Sagberg LM, Lindseth F (2014) Training nurses and educating the public using a virtual operating room with Oculus rift. In: Virtual systems & multimedia (VSMM), 2014 international conference on, IEEE, pp 206–213
  35. Kutafina E, Laukamp D, Bettermann R, Schroeder U, Jonas SM (2016) Wearable sensors for e-learning of manual tasks: using forearm EMG in hand hygiene training. Sensors 16(8):1221
  36. Li C, Rusak Z, Horvath I, Kooijman A, Ji L (2016) Implementation and validation of engagement monitoring in an engagement enhancing rehabilitation system. IEEE Trans Neural Syst Rehabil Eng 25(6):726–738
  37. Li C, Yang C, Wan J, Annamalai AS, Cangelosi A (2017) Teleoperation control of baxter robot using kalman filter-based sensor fusion. Syst Sci Control Eng 5(1):156–167
  38. Liarokapis F, Debattista K, Vourvopoulos A, Petridis P, Ene A (2014) Comparing interaction techniques for serious games through brain-computer interfaces: a user perception evaluation study. Entertain Comput 5(4):391–399
  39. Lupu RG, Ungureanu F, Stan A (2016) A virtual reality system for post stroke recovery. In: System theory, control and computing (ICSTCC), 2016 20th international conference on, IEEE, pp 300–305
  40. Marin G, Dominio F, Zanuttigh P (2014) Hand gesture recognition with Leap motion and Kinect devices. In: Image processing (ICIP), 2014 IEEE international conference on, IEEE, pp 1565–1569
  41. McCullough M, Xu H, Michelson J, Jackoski M, Pease W, Cobb W, Kalescky W, Ladd J, Williams B (2015) Myo arm: swinging to explore a VE. In: Proceedings of the ACM SIGGRAPH symposium on applied perception, ACM, pp 107–113
  42. Mewes A, Saalfeld P, Riabikin O, Skalej M, Hansen C (2016) A gesture-controlled projection display for CT-guided interventions. Int J Comput Assist Radiol Surg 11(1):157–164
  43. Mousavi Hondori H, Khademi M (2014) A review on technical and clinical impact of Microsoft Kinect on physical therapy and rehabilitation. J Med Eng 2014. doi:10.1155/2014/846514
  44. Bizzotto N, Costanzo A, Bizzotto L (2014) Leap motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surg Innov 1:2
  45. Nugraha BT, Sarno R, Asfani DA, Igasaki T, Munawar MN (2016) Classification of driver fatigue state based on EEG using Emotiv EPOC+. J Theor Appl Inf Technol 86(3):347
  46. Oskoei MA, Hu H (2007) Myoelectric control systems: a survey. Biomed Signal Process Control 2(4):275–294
  47. Palmisano S, Mursic R, Kim J (2017) Vection and cybersickness generated by head-and-display motion in the Oculus rift. Displays 46:1–8
  48. Phelan I, Arden M, Garcia C, Roast C (2015) Exploring virtual reality and prosthetic training. In: Virtual reality (VR), 2015 IEEE, IEEE, pp 353–354
  49. Powell C, Hatt SR (2009) Vision screening for amblyopia in childhood. Cochrane Database Syst Rev. doi:10.1002/14651858.CD005020.pub3
  50. Qamar A, Rahman MA, Basalamah S (2014) Adding inverse kinematics for providing live feedback in a serious game-based rehabilitation system. In: Intelligent systems, modelling and simulation (ISMS), 2014 5th international conference on, IEEE, pp 215–220
  51. Qamar AM, Khan AR, Husain SO, Rahman MA, Baslamah S (2015) A multi-sensory gesture-based occupational therapy environment for controlling home appliances. In: Proceedings of the 5th ACM on international conference on multimedia retrieval, ACM, pp 671–674
  52. Quesada L, López G, Guerrero L (2017) Automatic recognition of the american sign language fingerspelling alphabet to assist people living with speech or hearing impairments. J Ambient Intell Humaniz Comput 8(4):625–635
  53. Ramachandran VS, Rogers-Ramachandran D (2008) Sensations referred to a patient’s phantom arm from another subject’s intact arm: perceptual correlates of mirror neurons. Med Hypotheses 70(6):1233–1234
  54. Ranky G, Adamovich S (2010) Analysis of a commercial EEG device for the control of a robot arm. In: Bioengineering conference, proceedings of the 2010 IEEE 36th annual northeast, IEEE, pp 1–2
  55. Rautaray SS, Agrawal A (2015) Vision based hand gesture recognition for human computer interaction: a survey. Artif Intell Rev 43(1):1–54
  56. Rechy-Ramirez EJ, Hu H (2014) A flexible bio-signal based HMI for hands-free control of an electric powered wheelchair. Int J Artif Life Res (IJALR) 4(1):59–76
  57. Simoens P, De Coninck E, Vervust T, Van Wijmeersch JF, Ingelbinck T, Verbelen T, Op de Beeck M, Dhoedt B (2014) Vision: smart home control with head-mounted sensors for vision and brain activity. In: Proceedings of the fifth international workshop on Mobile cloud computing & services, ACM, pp 29–33
  58. Snow PW, Loureiro RC, Comley R (2014) Design of a robotic sensorimotor system for phantom limb pain rehabilitation. In: Biomedical robotics and biomechatronics 2014 5th IEEE RAS & EMBS international conference on, IEEE, pp 120–125
  59. Sonntag D, Orlosky J, Weber M, Gu Y, Sosnovsky S, Toyama T, Toosi EN (2015) Cognitive monitoring via eye tracking in virtual reality pedestrian environments. In: Proceedings of the 4th international symposium on pervasive displays, ACM, pp 269–270
  60. Subha DP, Joseph PK, Acharya R, Lim CM (2010) EEG signal analysis: a survey. J Med Syst 34(2):195–212
  61. Toutountzi T, Collander C, Phan S, Makedon F (2016) EyeOn: an activity recognition system using Myo armband. In: Proceedings of the 9th ACM international conference on PErvasive technologies related to assistive environments, ACM, p 82
  62. Verkijika SF, De Wet L (2015) Using a brain-computer interface (BCI) in reducing math anxiety: evidence from South Africa. Comput Educ 81:113–122
  63. Vikram S, Li L, Russell S (2013) Handwriting and gestures in the air, recognizing on the fly. Proc CHI 13:1179–1184
  64. Villagrasa S, Fonseca D, Durán J (2014) Teaching case: applying gamification techniques and virtual reality for learning building engineering 3D arts. In: Proceedings of the second international conference on technological ecosystems for enhancing multiculturality, ACM, pp 171–177
  65. Wake N, Sano Y, Oya R, Sumitani M, Kumagaya Si, Kuniyoshi Y (2015) Multimodal virtual reality platform for the rehabilitation of phantom limb pain. In: Neural engineering (NER), 2015 7th international IEEE/EMBS conference on, IEEE, pp 787–790
  66. Webel S, Olbrich M, Franke T, Keil J (2013) Immersive experience of current and ancient reconstructed cultural attractions. In: Digital heritage international congress (DigitalHeritage), 2013, IEEE, vol 1, pp 395–398
  67. Webster D, Celik O (2014) Systematic review of Kinect applications in elderly care and stroke rehabilitation. J Neuroeng Rehabil 11(1):108
  68. Weichert F, Bachmann D, Rudak B, Fisseler D (2013) Analysis of the accuracy and robustness of the Leap motion controller. Sensors 13(5):6380–6393
  69. Weisz J, Shababo B, Dong L, Allen PK (2013) Grasping with your face. In: Desai JP, Dudek G, Khatib O, Kumar V (eds) Experimental robotics, Springer, Heidelberg, pp 435–448
  70. Yu N, Xu C, Wang K, Yang Z, Liu J (2015) Gesture-based telemanipulation of a humanoid robot for home service tasks. In: Cyber technology in automation, control, and intelligent systems (CYBER), 2015 IEEE international conference on, IEEE, pp 1923–1927
  71. Zecca M, Micera S, Carrozza MC, Dario P (2002) Control of multifunctional prosthetic hands by processing the electromyographic signal. Crit Rev Biomed Eng 30:4–6
  72. Zyda M (2005) From visual simulation to virtual reality to games. Computer 38(9):25–32

Source: Impact of commercial sensors in human computer interaction: a review | SpringerLink


[Conference paper] Assistance System for Rehabilitation and Valuation of Motor Skills – Abstract+References

Abstract

This article proposes a non-invasive system to stimulate the rehabilitation of motor skills in both the upper and lower limbs. The system provides two human-computer interaction environments, chosen according to the type of motor deficiency the patient has: an augmented reality environment is used for patients with chronic injuries, while virtual reality environments are used for people with minor injuries. In both cases, the interface visualizes both the routine of movements prescribed to the patient and the actual movement the patient executes.

This information is relevant for the purpose of

  • (i) stimulating the patient during the execution of rehabilitation, and
  • (ii) evaluating the movements made so that the therapist can diagnose the progress of the patient’s rehabilitation process.

The visual environment developed for this type of rehabilitation provides a systematic application in which the user first analyzes and generates the necessary movements in order to complete the defined task.

The results show the efficiency of the system’s human-computer interaction in supporting the development of motor skills.


Source: Assistance System for Rehabilitation and Valuation of Motor Skills | SpringerLink


[Abstract] Architecture guideline for game-based stroke rehabilitation

Abstract:

Strokes are the most common cause of long-term disability of adults in developed countries. Continuous participation in rehabilitation can alleviate some of the consequences, and support recovery of stroke patients. However, physical rehabilitation requires commitment to tedious exercise routines over lengthy periods of time, which often cause patients to drop out of this form of therapy. In this context, game-based stroke rehabilitation has the potential to address two important barriers: accessibility of rehabilitation, and patient motivation.

This paper provides a review of design efforts in human-computer interaction (HCI) and gaming research to support stroke rehabilitation.

Source: Architecture guideline for game-based stroke rehabilitation: World Journal of Science, Technology and Sustainable Development: Vol 14, No 2/3


[Thesis] Exploring In-Home Monitoring of Rehabilitation and Creating an Authoring Tool for Physical Therapists – Full Text PDF

Abstract

Physiotherapy is a key part of treatment for neurological and musculoskeletal disorders, which affect millions in the U.S. each year. Physical therapy treatments typically consist of an initial diagnostic session during which patients’ impairments are assessed and exercises are prescribed to improve the impaired functions. As part of the treatment program, exercises are often assigned to be performed at home daily. Patients return to the clinic weekly or biweekly for check-up visits during which the physical therapist reassesses their condition and makes further treatment decisions, including readjusting the exercise prescriptions.

Most physical therapists work in clinics or hospitals. When patients perform their exercises at home, physical therapists cannot supervise them and lack quantitative exercise data reflecting the patients’ exercise compliance and performance. Without this information, it is difficult for physical therapists to make informed decisions or treatment adjustments. To make informed decisions, physical therapists need to know how often patients exercise, the duration and/or repetitions of each session, exercise metrics such as the average velocities and ranges of motion for each exercise, patients’ symptom levels (e.g. pain or dizziness) before and after exercise, and what mistakes patients make.

In this thesis, I evaluate and work towards a solution to this problem. The growing ubiquity of mobile and wearable technology makes possible the development of “virtual rehabilitation assistants.” Using motion sensors such as accelerometers and gyroscopes that are embedded in a wearable device, the “assistant” can mediate between patients at home and physical therapists in the clinic. Its functions are to:

  • use motion sensors to record home exercise metrics for compliance and performance and report these metrics to physical therapists in real-time or periodically;
  • allow physical therapists and patients to quantify and see progress on a fine-grain level;
  • record symptom levels to further help physical therapists gauge the effectiveness of exercise prescriptions;
  • offer real-time mistake recognition and feedback to the patients during exercises.
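
As a sketch of how the exercise metrics listed above might be derived, a joint-angle signal obtained from the wearable's motion sensors can be reduced to repetition counts and range of motion. The helper names, threshold, and simulated signal below are illustrative assumptions, not code from the thesis:

```python
import math

def count_repetitions(angle_series, threshold=30.0):
    """Count repetitions as excursions of a joint-angle signal (degrees)
    above `threshold` and back below it, e.g. an arm-elevation angle
    integrated from a wearable gyroscope."""
    reps = 0
    above = False
    for angle in angle_series:
        if not above and angle >= threshold:
            above = True
        elif above and angle < threshold:
            reps += 1
            above = False
    return reps

def range_of_motion(angle_series):
    """Peak-to-peak range of motion in degrees."""
    return max(angle_series) - min(angle_series)

# Simulated session: three arm raises to ~60 degrees, sampled at 50 Hz.
t = [i / 50.0 for i in range(300)]
angles = [60.0 * max(0.0, math.sin(2 * math.pi * 0.5 * ti)) for ti in t]
reps = count_repetitions(angles)    # three raises detected
rom = range_of_motion(angles)       # full 0-60 degree excursion
```

Per-session values like these are exactly the compliance and performance metrics a virtual assistant could report back to the physical therapist.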

One contribution of this thesis is an evaluation of the feasibility of this idea in real home settings. Because there has been little research on wearable virtual assistants in patient homes, there are many unanswered questions regarding their use and usefulness:

  • Q1. What patient in-home data could wearable virtual assistants gather to support physical therapy treatments?
  • Q2. Can patient data gathered by virtual assistants be useful to physical therapists?
  • Q3. How is this wearable in-home technology received by patients?

I sought to answer these questions by implementing and deploying a prototype called “SenseCap.” SenseCap is a small mobile device worn on a ball cap that monitors patients’ exercise movements and queries them about their symptoms. A technology probe study showed that the virtual assistant could gather important compliance, performance, and symptom data to assist physical therapists’ decision-making, and that this technology would be feasible and acceptable for in-home use by patients.

Another contribution of this thesis is the development of a tool that allows physical therapists to create and customize virtual assistants. With current technology, virtual assistants require engineering and programming effort to design, implement, configure, and deploy. Because most physical therapists do not have access to an engineering team, they and their patients would be unable to benefit from this technology. With the goal of making virtual assistants accessible to any physical therapist, I explored the following research questions:

  • Q4. Would a user-friendly rule-specification interface make it easy for physical therapists to specify correct and incorrect exercise movements directly to a computer? What are the limitations of this method of specifying exercise rules?
  • Q5. Is it possible to create a CAD-type authoring tool, based on a usable interface, that physical therapists could use to create their own customized virtual assistant for monitoring and coaching patients? What are the implementation details of such a system and the resulting virtual assistant?
  • Q6. What preferences do physical therapists have regarding the delivery of coaching feedback for patients?
  • Q7. What is the recognition accuracy of a virtual rehabilitation assistant created by this tool?

This dissertation research aims to improve our understanding of the barriers to rehabilitation that arise from the invisibility of home exercise behavior; to lower these barriers by making it possible for patients to use a widely available, easy-to-use wearable device that coaches and monitors them while they perform their exercises; and to improve the ability of physical therapists to create an exercise regimen for their patients and to learn what patients have done in performing these exercises. In doing so, treatment should be better suited to each patient and more successful.

To Continue —> Download Full Text PDF


[ARTICLE] Movement-Based Interaction Applied to Physical Rehabilitation Therapies – Full Text HTML

ABSTRACT

Background: Health care environments are continuously improving conditions, especially regarding the use of current technology. In the field of rehabilitation, the use of video games and related technology has helped to develop new rehabilitation procedures. Patients are able to work on their disabilities through new processes that are more motivating and entertaining. However, these patients are required to leave their home environment to complete their rehabilitation programs.

Objective: The focus of our research interests is on finding a solution to eliminate the need for patients to interrupt their daily routines to attend rehabilitation therapy. We have developed an innovative system that allows patients with a balance disorder to perform a specific rehabilitation exercise at home. Additionally, the system features an assistive tool to complement the work of physiotherapists. Medical staff are thus provided with a system that avoids the need for them to be present during the exercise in specific cases in which patients are under suitable supervision.

Methods: A movement-based interaction device was used to achieve a reliable system for monitoring rehabilitation exercises performed at home. The system accurately utilizes parameters previously defined by the specialist for correct performance of the exercise. Accordingly, the system gives instructions and corrects the patient’s actions. The data generated during the session are collected for assessment by the specialist to adapt the difficulty of the exercise to the patient’s progress.
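
As an illustration of how specialist-defined parameters might drive the instruction and correction described above, consider a single balance-exercise sample checked against a tolerance band. All names and values here are assumptions for illustration, not the system's actual design:

```python
from dataclasses import dataclass

@dataclass
class ExerciseSpec:
    """Parameters a specialist might define for one balance exercise."""
    target_lean_deg: float   # desired trunk lean
    tolerance_deg: float     # acceptable deviation from the target
    hold_seconds: float      # how long the pose must be held

def assess_sample(spec, measured_lean_deg):
    """Return a corrective instruction for one tracked sample,
    or None when the patient is within tolerance."""
    error = measured_lean_deg - spec.target_lean_deg
    if abs(error) <= spec.tolerance_deg:
        return None
    return "lean further" if error < 0 else "lean back"

spec = ExerciseSpec(target_lean_deg=10.0, tolerance_deg=2.0,
                    hold_seconds=5.0)
correction = assess_sample(spec, 7.0)   # under-leaning: prompt the patient
ok = assess_sample(spec, 10.5)          # within tolerance: no correction
```

Logging each sample's error alongside the correction issued gives the specialist exactly the session data needed to adapt the exercise difficulty to the patient's progress.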

Results: The evaluation of the system was conducted by two experts in balance disorder rehabilitation. They were required to verify the effectiveness of the system, and they also facilitated the simulation of real patient behavior. They used the system freely for a period of time and provided interesting and optimistic feedback. First, they evaluated the system as a tool for real-life rehabilitation therapy. Second, their interaction with the system allowed us to obtain important feedback needed to improve the system.

Conclusions: The system improves the rehabilitation conditions of people with balance disorder. The main contribution comes from the fact that it allows patients to carry out the rehabilitation process at home under the supervision of physiotherapists. As a result, patients avoid having to attend medical centers. Additionally, medical staff have access to an assistant, which means their presence is not required in many exercises that involve constant repetition.

Full Text HTML –> JMIR-Movement-Based Interaction Applied to Physical Rehabilitation Therapies | Garrido Navarro | Journal of Medical Internet Research.


ARTICLE: Human computer interactive system for fast recovery based stroke rehabilitation

…One way to improve the stroke rehabilitation process is through a human-interactive system, which can be achieved with augmented reality technology. This development draws from work currently being pursued in the gaming industry to make augmented reality technology more accessible to the medical industry for the improvement of stroke rehabilitation. In this paper, two augmented reality games, a Pong game and a Goal Keeper game, were developed. These games have been designed for rehabilitation with human-interactive systems in mind and have features such as on-screen feedback and high immersive value to keep stroke survivors motivated in the rehabilitation process. The developed games aim to replace tedious and repetitive traditional rehabilitation exercises. This paper details the success of implementing augmented reality in the rehabilitation process, which will in turn benefit society by reducing the number of people living at home with stroke-related disabilities and the need for direct supervision from a therapist…

via IEEE Xplore Abstract (Abstract) – Human computer interactive system for fast recovery based stroke rehabilitation.

