Posts Tagged oculus rift

[ARTICLE] BCI and FES Based Therapy for Stroke Rehabilitation Using VR Facilities – Full Text

Abstract

In recent years, assistive technologies and stroke rehabilitation methods have been empowered by virtual reality environments and by the facilities offered by brain-computer interface systems and functional electrical stimulators. In this paper, a therapy system for stroke rehabilitation based on these techniques is presented. Using an Oculus Rift virtual reality device, the proposed system ushers the patient into a virtual scenario where a virtual therapist coordinates exercises aimed at restoring brain function. The electrical stimulator helps the patient perform the rehabilitation exercises, while the brain-computer interface system and an electrooculography device are used to determine whether the exercises are executed properly. Laboratory tests on healthy people validated the system from a technical point of view. The clinical tests are still in progress, but preliminary results have highlighted a high degree of patient satisfaction, quick accommodation to the proposed therapy, and rapid progress in each user's rehabilitation.

1. Introduction

Worldwide statistics reported by the World Health Organization highlight that stroke is the third leading cause of death and that about 15 million people suffer a stroke worldwide each year [1]. Of these, 5 million are left permanently disabled and need long-term assistance, and only 5 million are considered socially reintegrated after recovery. Recovering from a stroke is a long and difficult process that requires patience, commitment, and access to various assistive technologies and special devices. Rehabilitation is an important part of recovery and helps the patient retain abilities or regain lost ones in order to become more independent. Given the depression that often follows a stroke, it is very important for a patient to benefit from an efficient and fast rehabilitation program followed by a quick return to community living [2]. In the last decade, many research groups have focused on motor, cognitive, or speech recovery after stroke, such as the Stroke Centers at the Johns Hopkins Institute [3], ENIGMA-Stroke Recovery [4], and the StrokeBack Consortium funded by the European Union's Seventh Framework Programme [5]. Major ICT companies also make substantial contributions to the development of technologies and equipment that can be integrated into rehabilitation systems. For example, Stroke Recovery with Kinect is a research project building an interactive home-rehabilitation system for motor recovery after stroke based on Microsoft Kinect technology [6].

In recent years, virtual reality (VR) applications have received a boost in development as VR headset prices dropped below $1000, turning headsets into a mass-market product [7]. VR was, and still is, used especially for military training and video games, where it provides users with a sense of realism and interaction with the virtual environment [8]. It now increasingly attracts the interest of physicians and therapists, who are exploring the potential of VR headsets and augmented reality (AR) to improve the neuroplasticity of the brain and to support neurorehabilitation and the treatment of motor and mental disorders [9]. However, considering the diversity of interventions and methods used, there is no evidence that VR therapy alone is efficacious compared with traditional therapies for a particular type of impairment [10]. This does not mean that the potential of VR was overestimated or that the results fell short of expectations. Rather, VR therapy must be complemented with other rehabilitation technologies, such as robotic therapy, brain-computer interface (BCI) and functional electrical stimulation (FES) therapy, and, not least, traditional therapy, to provide a more targeted approach [11].

SaeboVR is a virtual rehabilitation system focusing exclusively on activities of daily living; it uses an on-screen virtual assistant to educate the patient and facilitate performance by providing real-time feedback [12]. The neurotechnology company MindMaze has introduced MindMotion PRO, a 3D virtual environment therapy for upper-limb neurorehabilitation that incorporates VR-based physical and cognitive exercise games into stroke rehabilitation programs [13]. At New York Dynamic Neuromuscular Rehabilitation, the VR-based CAREN (Computer Assisted Rehabilitation Environment) is currently used to treat post-stroke and post-brain-injury patients [14]. The EVREST Multicentre has achieved remarkable results regarding the use of VR exercises in stroke rehabilitation [15].

Motor imagery (MI) has been used in post-stroke rehabilitation for a long time. One of its major problems was the lack of an objective method to determine whether the user is actually performing the expected movement imagination. MI-based BCIs can quantify motor imagery and output signals that can be used to control an external device such as a wheelchair, a neuroprosthesis, or a computer. FES therapy combined with an MI-based BCI has therefore become a promising technique for stroke rehabilitation. Instead of providing communication, MI is used in this case to induce closed-loop feedback within conventional post-stroke rehabilitation therapy. This approach is called paired stimulation (PS) because it pairs each episode of the user's motor imagery with stimulation and feedback, such as activation of a functional electrical stimulator (FES), avatar movement, and/or auditory feedback [16]. Recent research from many groups has shown that MI can be recorded from patients in the clinical environment and used to control real-time feedback, and it supports the hypothesis that PS could improve rehabilitation outcomes [17–21].
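To make the paired-stimulation idea concrete, here is a minimal sketch of the closed loop. All names are invented for illustration (the cited systems publish no code here): a classifier decision on an EEG window gates both the FES pulse and the feedback, so stimulation is paired only with correctly detected imagery.

```csharp
using System;

// Minimal paired-stimulation (PS) loop sketch: motor imagery classified as
// correct triggers FES and feedback. All types are hypothetical placeholders,
// not APIs of any system cited in the text.
enum MiClass { None, LeftHand, RightHand }

interface IMiClassifier { MiClass Classify(float[] eegWindow); }
interface IFesDevice   { void Stimulate(MiClass hand, int milliseconds); }
interface IFeedback    { void ShowAvatarMovement(MiClass hand); }

class PairedStimulationTrial
{
    readonly IMiClassifier classifier;
    readonly IFesDevice fes;
    readonly IFeedback feedback;

    public PairedStimulationTrial(IMiClassifier c, IFesDevice f, IFeedback fb)
    { classifier = c; fes = f; feedback = fb; }

    // One trial: the patient is cued to imagine moving `cuedHand`;
    // stimulation and feedback fire only on a correct classification.
    public bool Run(MiClass cuedHand, float[] eegWindow)
    {
        MiClass detected = classifier.Classify(eegWindow);
        if (detected == cuedHand)
        {
            fes.Stimulate(cuedHand, 2000);         // open the corresponding hand
            feedback.ShowAvatarMovement(cuedHand); // visual reinforcement
            return true;
        }
        return false; // no stimulation on incorrect or absent imagery
    }
}
```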

In a recent study, Irimia et al. [22] demonstrated the efficacy of combining motor imagery, bar feedback, and real hand movements by testing a system pairing an MI-based BCI with a neurostimulator on three stroke patients. In every session, the patients had to imagine 120 left-hand and 120 right-hand movements. Visual feedback was provided in the form of a bar extending on the screen. In trials where the correct imagination was classified, the FES was activated to induce the opening of the corresponding hand. All patients achieved high control accuracies and exhibited improvements in motor function. In a later study, Cho et al. [23] presented the results of two patients who performed BCI training with first-person avatar feedback. After the study, both patients reported improvements in motor function and improved their scores on the Upper Extremity Fugl-Meyer Assessment scale. Although the number of patients in these two studies is small, they support the idea that such systems may bring additional benefits to rehabilitation outcomes in stroke patients.

2. General System Architecture

The BCI-FES technique presented in this paper is part of a much more complex system designed for stroke rehabilitation called TRAVEE [24], presented in Figure 1. The stimulation devices, the monitoring devices, the VR headset, and a computer running the software are the main modules of the TRAVEE system. The stimulation devices help the patient perform the exercises, and the monitoring devices determine whether the exercises are executed properly, according to the proposed scenarios. In essence, the TRAVEE system should be seen as a software kernel that allows a series of rehabilitation exercises to be defined over a set of USB-connectable devices. This approach is very useful because it gives the patient the option to buy, borrow, or rent the abovementioned devices according to their needs; once the devices are connected, the therapist may choose the suitable set of exercises.[…]
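One way to picture the "software kernel plus USB-connectable devices" design is as a small device registry against which exercises are defined. The sketch below is our own illustration with hypothetical names, not TRAVEE code: devices advertise whether they stimulate or monitor, and an exercise can run only if its required hardware is attached.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch of a kernel that composes exercises from whichever
// USB devices are currently attached. All names are hypothetical.
enum DeviceRole { Stimulation, Monitoring }

interface IRehabDevice
{
    string Name { get; }
    DeviceRole Role { get; }
    bool Connect();
}

class ExerciseKernel
{
    readonly List<IRehabDevice> connected = new List<IRehabDevice>();

    public void Attach(IRehabDevice device)
    {
        if (device.Connect()) connected.Add(device);
    }

    // An exercise can be scheduled only if the devices it needs are present,
    // mirroring how the therapist picks exercises to match the hardware.
    public bool CanRun(IEnumerable<string> requiredDevices) =>
        requiredDevices.All(name => connected.Any(d => d.Name == name));
}
```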

 

Continue —> BCI and FES Based Therapy for Stroke Rehabilitation Using VR Facilities


[WEB SITE] How Virtual Avatars Help Stroke Patients Improve Motor Function

At USC, Dr. Sook-Lei Liew is testing whether watching a virtual avatar that moves in response to brain commands can activate portions of the brain damaged by stroke.
Dr. Sook-Lei Liew (Photo: Nate Jensen)

I am hooked up to a 16-channel brain machine interface with 12 channels of EEG on my head and ears and four channels of electromyography (EMG) on my arms. An Oculus Rift occludes my vision.

Two inertial measurement units (IMU) are stuck to my wrists and forearms, tracking the orientation of my arms, while the EMG monitors my electrical impulses and peripheral nerve activity.

Dr. Sook-Lei Liew, Director of USC’s Neural Plasticity and Neurorehabilitation Laboratory, and Julia Anglin, Research Lab Supervisor and Technician, wait to record my baseline activity and observe a monitor with a representation of my real arm and a virtual limb. I see the same image from inside the Rift.

“Ready?” asks Dr. Liew. “Don’t move—or think.”

I stay still, close my eyes, and let my mind go blank. Anglin records my baseline activity, allowing the brain-machine interface to take signals from the EEG and EMG, alongside the IMU, and use that data to inform an algorithm that drives the virtual avatar hand.
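The article does not describe the decoding algorithm itself, so the following is only a generic sketch of one common approach in MI systems: comparing live sensorimotor band power against the recorded resting baseline (event-related desynchronization). None of this is necessarily what REINVENT does; names and the threshold are invented.

```csharp
using System;
using System.Linq;

// Generic baseline-vs-live band-power comparison, a common (assumed) way to
// turn "think about moving" into a control signal. Not the REINVENT code.
static class MiDecoder
{
    // Mean squared amplitude of a band-pass-filtered EEG window,
    // a simple proxy for mu/beta band power.
    static double BandPower(double[] filteredWindow) =>
        filteredWindow.Average(x => x * x);

    // Motor imagery suppresses sensorimotor rhythms, so live power clearly
    // below the resting baseline counts as detected movement intent.
    public static bool IntentDetected(double[] baseline, double[] live,
                                      double threshold = 0.8)
    {
        return BandPower(live) < threshold * BandPower(baseline);
    }
}
```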

“Now just think about moving your arm to the avatar’s position,” says Dr. Liew.

I don’t move a muscle, but think about movement while looking at the two arms on the screen. Suddenly, my virtual arm moves toward the avatar appendage inside the VR world.

VR rehab at USC

Something happened just because I thought about it! I’ve read tons of data on how this works, even seen other people do it, especially inside gaming environments, but it’s something else to experience it for yourself.

“Very weird, isn’t it?” says David Karchem, one of Dr. Liew’s trial patients. Karchem suffered a stroke while driving his car eight years ago, and has shown remarkable recovery using her system.

“My stroke came out of the blue and it was terrifying, because I suddenly couldn’t function. I managed to get my car through an intersection and call the paramedics. I don’t know how,” Karchem says.

He gets around with a walking stick today, and has relatively normal function on the right side of his body. However, his left side is clearly damaged from the stroke. While talking, he unwraps surgical bandages and a splint from his left hand, crooked into his chest, to show Dr. Liew the progress since his last VR session.

As a former software engineer, Karchem isn’t fazed by using advanced technology to aid the clinical process. “I quickly learned, in fact, that the more intellectual and physical stimulation you get, the faster you can recover, as the brain starts to fire. I’m something of a lab rat now and I love it,” he says.

REINVENT Yourself

Karchem is participating in Dr. Liew’s REINVENT (Rehabilitation Environment using the Integration of Neuromuscular-based Virtual Enhancements for Neural Training) project, funded by the American Heart Association, under a National Innovative Research Grant. It’s designed to help patients who have suffered strokes reconnect their brains to their bodies.

VR rehab at USC (Photo: Nate Jensen)

“My PhD in Occupational Science, with a concentration in Cognitive Neuroscience, focused on how experience changes brain networks,” explains Dr. Liew. “I continued this work as a Postdoctoral Fellow at the National Institute of Neurological Disorders and Stroke at the National Institutes of Health, before joining USC, in my current role, in 2015.

“Our main goal here is to enhance neural plasticity or neural recovery in individuals using noninvasive brain stimulation, brain-computer interfaces and novel learning paradigms to improve patients’ quality of life and engagement in meaningful activities,” she says.

Here’s the science bit: the human putative mirror neuron system (MNS) is a key motor network in the brain that is active both when you perform an action, like moving your arm, and when you simply watch someone else—like a virtual avatar—perform that same action. Dr. Liew hypothesizes that, for stroke patients who can’t move their arm, simply watching a virtual avatar that moves in response to their brain commands will activate the MNS and retrain damaged or neighboring motor regions of the brain to take over the role of motor performance. This should lead to improved motor function.

“In previous occupational therapy sessions, we found many people with severe strokes got frustrated because they didn’t know if they were activating the right neural networks when we asked them to ‘think about moving’ while we physically helped them to do so,” Dr. Liew says. “If they can’t move at all, even if the right neurological signals are happening, they have no biological feedback to reinforce the learning and help them continue the physical therapy to recover.”

For many people, the knowledge that there is “intent before movement” (the brain has to “think” about moving before the body will do so) is news. We also carry a “body map” inside our heads that predicts our presence in space and time, so that we don’t constantly bash into things and we know when something is wrong. Both of these brain-body elements face massive disruption after a stroke. The brain literally no longer knows how to help the body move.

Dr. Liew’s VR platform shows patients how this causal link works, aiding speedier, less frustrating recovery in real life.

From the Conference Hall to the Lab

She got the idea while geeking out in Northern California one day.

“I went to the Experiential Technology Conference in San Francisco in 2015, and saw demos of intersections of neuroscience and technology, including EEG-based experiments, wearables, and so on. I could see the potential to help our clinical population by building a sensory-visual motor contingency between your own body and an avatar that you’re told is ‘you,’ which provides rewarding sensory feedback to reestablish brain-body signals.

“Inside VR you start to map the two together; it’s astonishing. It becomes an automatic process. We have seen that people who have had a stroke are able to ’embody’ an avatar that does move, even though their own body, right now, cannot,” she says.


Dr. Liew’s system is somewhat hacked together, in the best possible Maker Movement style; she built what didn’t exist and modified what did to her requirements.

“We wanted to keep costs low and build a working device that patients could actually afford to buy. We use Oculus for the [head-mounted display]. Then, while most EEG systems are $10,000 or more, we used an OpenBCI system to build our own, with EMG, for under $1,000.

“We needed an EEG cap, but most EEG manufacturers wanted to charge us $200 or more. So, we decided to hack the rest of the system together, ordering a swim cap from Amazon and taking a mallet and bashing holes in it to match the 12 positions where the head electrodes needed to be placed (within the 10-10 international EEG system). We also 3D print the EEG clips and IMU holders here at the lab.


“For the EMG, we use off-the-shelf disposable sensors. This allows us to track electromyography if the patient does have trace muscular activity. In terms of the software platform, we coded custom elements in C#, from Microsoft, and implemented them in the Unity3D game engine.”
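The article names C# and Unity3D but shows none of the code, so the component below is purely a guess at what the avatar-driving step could look like (class and field names are invented; this is not the REINVENT source): whenever the decoding pipeline reports movement intent, the patient's virtual arm eases toward the avatar's demonstrated pose. In the actual study the decoded signal comes from the EEG/EMG rig described above; here it is reduced to a single boolean flag for clarity.

```csharp
using UnityEngine;

// Hypothetical Unity component sketch: moves the patient's virtual arm
// toward the avatar's target pose while movement intent is being decoded.
public class AvatarArmDriver : MonoBehaviour
{
    public Transform virtualArm;   // arm rendered for the patient
    public Transform targetPose;   // avatar's demonstrated arm position
    public float speed = 0.5f;     // meters per second of assisted motion

    // Set each frame by the BCI pipeline (true while intent is detected).
    public bool intentDetected;

    void Update()
    {
        if (!intentDetected) return;
        virtualArm.position = Vector3.MoveTowards(
            virtualArm.position, targetPose.position, speed * Time.deltaTime);
        virtualArm.rotation = Quaternion.RotateTowards(
            virtualArm.rotation, targetPose.rotation, 90f * Time.deltaTime);
    }
}
```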

Dr. Liew is very keen to bridge the gap between academia and the tech industry; she has just submitted for publication a new academic paper with the latest trial results from her work. Last year, she spoke at SXSW 2017 about how VR affects the brain and debuted REINVENT at the conference’s VR Film Festival, where it received a “Special Jury Recognition for Innovative Use of Virtual Reality in the Field of Health.”

Going forward, Dr. Liew would like to bring her research to a wider audience.


“I feel the future of brain-computer interfaces splits into adaptive, as with implanted electrodes, and rehabilitative, which is what we work on. What we hope to do with REINVENT is allow patients to use our system to re-train their neural pathways, [so they] eventually won’t need it, as they’ll have recovered.

“We’re talking now about a commercial spin-off potential. We’re able to license the technology right now, but, as researchers, our focus, for the moment, is in furthering this field and delivering more trial results in published peer-reviewed papers. Once we have enough data we can use machine learning to tailor the system precisely for each patient and share our results around the world.”

If you’re in L.A., Dr. Liew and her team will be participating in the Creating Reality VR Hackathon from March 12-15 at USC. Details here.

via How Virtual Avatars Help Stroke Patients Improve Motor Function | News & Opinion | PCMag.com


[Abstract+References] Impact of commercial sensors in human computer interaction: a review

Abstract

Nowadays, the communication gap between humans and computers can be reduced thanks to the multimodal sensors available on the market. It is therefore important to know the specifications of these sensors and how they are being used to create human-computer interfaces that tackle complex tasks. The purpose of this paper is to review recent research on the up-to-date application areas of the following sensors:

(1) Emotiv sensor, which identifies emotions, facial expressions, thoughts, and head movements from users through electroencephalography signals,

(2) Leap motion controller, which recognizes hand and arm movements via vision techniques,

(3) Myo armband, which identifies hand and arm movements using electromyography signals and inertial sensors, and

(4) Oculus Rift, which provides users with immersion in virtual reality.

The application areas discussed in this manuscript range from assistive technology to virtual tours. Finally, a brief discussion of the advantages and shortcomings of each sensor is presented.

References

  1. Abreu JG, Teixeira JM, Figueiredo LS, Teichrieb V (2016) Evaluating sign language recognition using the Myo armband. In: Virtual and augmented reality (SVR), 2016 XVIII symposium on, IEEE, pp 64–70
  2. Bassily D, Georgoulas C, Guettler J, Linner T, Bock T (2014) Intuitive and adaptive robotic arm manipulation using the Leap Motion controller. In: ISR/Robotik 2014; 41st international symposium on robotics; proceedings of, VDE, pp 1–7
  3. Bernardos AM, Sánchez JM, Portillo JI, Wang X, Besada JA, Casar JR (2016) Design and deployment of a contactless hand-shape identification system for smart spaces. J Ambient Intell Humaniz Comput 7(3):357–370
  4. Blaha J, Gupta M (2014) Diplopia: a virtual reality game designed to help amblyopics. In: Virtual reality (VR), 2014 IEEE, pp 163–164
  5. Boschmann A, Dosen S, Werner A, Raies A, Farina D (2016) A novel immersive augmented reality system for prosthesis training and assessment. In: Biomedical and health informatics (BHI), 2016 IEEE-EMBS international conference on, IEEE, pp 280–283
  6. Brennan CP, McCullagh PJ, Galway L, Lightbody G (2015) Promoting autonomy in a smart home environment with a smarter interface. In: Engineering in medicine and biology society (EMBC), 2015 37th annual international conference of the IEEE, IEEE, pp 5032–5035
  7. Cacace J, Finzi A, Lippiello V, Furci M, Mimmo N, Marconi L (2016) A control architecture for multiple drones operated via multimodal interaction in search & rescue mission. In: Safety, security, and rescue robotics (SSRR), 2016 IEEE international symposium on, IEEE, pp 233–239
  8. Carrino F, Tscherrig J, Mugellini E, Khaled OA, Ingold R (2011) Head-computer interface: a multimodal approach to navigate through real and virtual worlds. In: International conference on human-computer interaction, Springer, pp 222–230
  9. Charles D, Pedlow K, McDonough S, Shek K, Charles T (2014) Close range depth sensing cameras for virtual reality based hand rehabilitation. J Assist Technol 8(3):138–149
  10. Chuan CH, Regina E, Guardino C (2014) American Sign Language recognition using Leap Motion sensor. In: Machine learning and applications (ICMLA), 2014 13th international conference on, IEEE, pp 541–544
  11. Ciolan IM, Buraga SC, Dafinoiu I (2016) Oculus Rift 3D interaction and nicotine craving: results from a pilot study. In: ROCHI international conference on human-computer interaction, p 58
  12. Da Gama A, Fallavollita P, Teichrieb V, Navab N (2015) Motor rehabilitation using Kinect: a systematic review. Games Health J 4(2):123–135
  13. dos Reis Alves SF, Uribe-Quevedo AJ, da Silva IN, Ferasoli Filho H (2014) Pomodoro, a mobile robot platform for hand motion exercising. In: Biomedical robotics and biomechatronics, 2014 5th IEEE RAS & EMBS international conference on, IEEE, pp 970–974
  14. Duvinage M, Castermans T, Petieau M, Hoellinger T, Cheron G, Dutoit T (2013) Performance of the Emotiv EPOC headset for P300-based applications. Biomed Eng Online 12(1):56
  15. Farahani N, Post R, Duboy J, Ahmed I, Kolowitz BJ, Krinchai T, Monaco SE, Fine JL, Hartman DJ, Pantanowitz L (2016) Exploring virtual reality technology and the Oculus Rift for the examination of digital pathology slides. J Pathol Inform 7
  16. Fiałek S, Liarokapis F (2016) Comparing two commercial brain computer interfaces for serious games and virtual environments. In: Karpouzis K, Yannakakis GN (eds) Emotion in games. Springer, Switzerland, pp 103–117
  17. Funasaka M, Ishikawa Y, Takata M, Joe K (2015) Sign language recognition using Leap Motion controller. In: Proceedings of the international conference on parallel and distributed processing techniques and applications (PDPTA), WorldComp, p 263
  18. Gándara CV, Bauza CG (2015) Intellihome: a framework for the development of ambient assisted living applications based in low-cost technology. In: Proceedings of the Latin American conference on human computer interaction, ACM, p 18
  19. Gomez-Gil J, San-Jose-Gonzalez I, Nicolas-Alonso LF, Alonso-Garcia S (2011) Steering a tractor by means of an EMG-based human-machine interface. Sensors 11(7):7110–7126
  20. Gonzalez-Sanchez J, Chavez-Echeagaray ME, Atkinson R, Burleson W (2011) ABE: an agent-based software architecture for a multimodal emotion recognition framework. In: Software architecture (WICSA), 2011 9th working IEEE/IFIP conference on, IEEE, pp 187–193
  21. Grubišić I, Skala Kavanagh H, Grazio S (2015) Novel approaches in hand rehabilitation. Period Biol 117(1):139–145
  22. Guna J, Jakus G, Pogačnik M, Tomažič S, Sodnik J (2014) An analysis of the precision and reliability of the Leap Motion sensor and its suitability for static and dynamic tracking. Sensors 14(2):3702–3720
  23. Gunasekera WL, Bendall J (2005) Rehabilitation of neurologically injured patients. In: Moore AJ, Newell DW (eds) Neurosurgery. Springer, London, pp 407–421
  24. Güttler J, Shah R, Georgoulas C, Bock T (2015) Unobtrusive tremor detection and measurement via human-machine interaction. Procedia Comput Sci 63:467–474
  25. Han J, Shao L, Xu D, Shotton J (2013) Enhanced computer vision with Microsoft Kinect sensor: a review. IEEE Trans Cybern 43(5):1318–1334
  26. Hettig J, Mewes A, Riabikin O, Skalej M, Preim B, Hansen C (2015) Exploration of 3D medical image data for interventional radiology using myoelectric gesture control. In: Proceedings of the eurographics workshop on visual computing for biology and medicine, pp 177–185
  27. Ijjada MS, Thapliyal H, Caban-Holt A, Arabnia HR (2015) Evaluation of wearable head set devices in older adult populations for research. In: Computational science and computational intelligence (CSCI), 2015 international conference on, IEEE, pp 810–811
  28. Jurcak V, Tsuzuki D, Dan I (2007) 10/20, 10/10, and 10/5 systems revisited: their validity as relative head-surface-based positioning systems. Neuroimage 34(4):1600–1611
  29. Kefer K, Holzmann C, Findling RD (2016) Comparing the placement of two arm-worn devices for recognizing dynamic hand gestures. In: Proceedings of the 14th international conference on advances in mobile computing and multi media, ACM, pp 99–104
  30. Khademi M, Mousavi Hondori H, McKenzie A, Dodakian L, Lopes CV, Cramer SC (2014) Free-hand interaction with Leap Motion controller for stroke rehabilitation. In: Proceedings of the extended abstracts of the 32nd annual ACM conference on human factors in computing systems, ACM, pp 1663–1668
  31. Khan FR, Ong HF, Bahar N (2016) A sign language to text converter using Leap Motion. Int J Adv Sci Eng Inf Technol 6(6):1089–1095
  32. Kim SY, Kim YY (2012) Mirror therapy for phantom limb pain. Korean J Pain 25(4):272–274
  33. Kiorpes L, McKee SP (1999) Neural mechanisms underlying amblyopia. Curr Opin Neurobiol 9(4):480–486
  34. Kleven NF, Prasolova-Førland E, Fominykh M, Hansen A, Rasmussen G, Sagberg LM, Lindseth F (2014) Training nurses and educating the public using a virtual operating room with Oculus Rift. In: Virtual systems & multimedia (VSMM), 2014 international conference on, IEEE, pp 206–213
  35. Kutafina E, Laukamp D, Bettermann R, Schroeder U, Jonas SM (2016) Wearable sensors for eLearning of manual tasks: using forearm EMG in hand hygiene training. Sensors 16(8):1221
  36. Li C, Rusak Z, Horvath I, Kooijman A, Ji L (2016) Implementation and validation of engagement monitoring in an engagement enhancing rehabilitation system. IEEE Trans Neural Syst Rehabil Eng 25(6):726–738
  37. Li C, Yang C, Wan J, Annamalai AS, Cangelosi A (2017) Teleoperation control of Baxter robot using Kalman filter-based sensor fusion. Syst Sci Control Eng 5(1):156–167
  38. Liarokapis F, Debattista K, Vourvopoulos A, Petridis P, Ene A (2014) Comparing interaction techniques for serious games through brain-computer interfaces: a user perception evaluation study. Entertain Comput 5(4):391–399
  39. Lupu RG, Ungureanu F, Stan A (2016) A virtual reality system for post stroke recovery. In: System theory, control and computing (ICSTCC), 2016 20th international conference on, IEEE, pp 300–305
  40. Marin G, Dominio F, Zanuttigh P (2014) Hand gesture recognition with Leap Motion and Kinect devices. In: Image processing (ICIP), 2014 IEEE international conference on, IEEE, pp 1565–1569
  41. McCullough M, Xu H, Michelson J, Jackoski M, Pease W, Cobb W, Kalescky W, Ladd J, Williams B (2015) Myo arm: swinging to explore a VE. In: Proceedings of the ACM SIGGRAPH symposium on applied perception, ACM, pp 107–113
  42. Mewes A, Saalfeld P, Riabikin O, Skalej M, Hansen C (2016) A gesture-controlled projection display for CT-guided interventions. Int J Comput Assist Radiol Surg 11(1):157–164
  43. Mousavi Hondori H, Khademi M (2014) A review on technical and clinical impact of Microsoft Kinect on physical therapy and rehabilitation. J Med Eng 2014. doi:10.1155/2014/846514
  44. Bizzotto N, Costanzo A, Bizzotto L (2014) Leap Motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surg Innov 1:2
  45. Nugraha BT, Sarno R, Asfani DA, Igasaki T, Munawar MN (2016) Classification of driver fatigue state based on EEG using Emotiv EPOC+. J Theor Appl Inf Technol 86(3):347
  46. Oskoei MA, Hu H (2007) Myoelectric control systems: a survey. Biomed Signal Process Control 2(4):275–294
  47. Palmisano S, Mursic R, Kim J (2017) Vection and cybersickness generated by head-and-display motion in the Oculus Rift. Displays 46:1–8
  48. Phelan I, Arden M, Garcia C, Roast C (2015) Exploring virtual reality and prosthetic training. In: Virtual reality (VR), 2015 IEEE, pp 353–354
  49. Powell C, Hatt SR (2009) Vision screening for amblyopia in childhood. Cochrane Database Syst Rev. doi:10.1002/14651858.CD005020.pub3
  50. Qamar A, Rahman MA, Basalamah S (2014) Adding inverse kinematics for providing live feedback in a serious game-based rehabilitation system. In: Intelligent systems, modelling and simulation (ISMS), 2014 5th international conference on, IEEE, pp 215–220
  51. Qamar AM, Khan AR, Husain SO, Rahman MA, Baslamah S (2015) A multi-sensory gesture-based occupational therapy environment for controlling home appliances. In: Proceedings of the 5th ACM on international conference on multimedia retrieval, ACM, pp 671–674
  52. Quesada L, López G, Guerrero L (2017) Automatic recognition of the American Sign Language fingerspelling alphabet to assist people living with speech or hearing impairments. J Ambient Intell Humaniz Comput 8(4):625–635
  53. Ramachandran VS, Rogers-Ramachandran D (2008) Sensations referred to a patient's phantom arm from another subject's intact arm: perceptual correlates of mirror neurons. Med Hypotheses 70(6):1233–1234
  54. Ranky G, Adamovich S (2010) Analysis of a commercial EEG device for the control of a robot arm. In: Bioengineering conference, proceedings of the 2010 IEEE 36th annual northeast, IEEE, pp 1–2
  55. Rautaray SS, Agrawal A (2015) Vision based hand gesture recognition for human computer interaction: a survey. Artif Intell Rev 43(1):1–54
  56. Rechy-Ramirez EJ, Hu H (2014) A flexible bio-signal based HMI for hands-free control of an electric powered wheelchair. Int J Artif Life Res 4(1):59–76
  57. Simoens P, De Coninck E, Vervust T, Van Wijmeersch JF, Ingelbinck T, Verbelen T, Op de Beeck M, Dhoedt B (2014) Vision: smart home control with head-mounted sensors for vision and brain activity. In: Proceedings of the fifth international workshop on mobile cloud computing & services, ACM, pp 29–33
  58. Snow PW, Loureiro RC, Comley R (2014) Design of a robotic sensorimotor system for phantom limb pain rehabilitation. In: Biomedical robotics and biomechatronics, 2014 5th IEEE RAS & EMBS international conference on, IEEE, pp 120–125
  59. Sonntag D, Orlosky J, Weber M, Gu Y, Sosnovsky S, Toyama T, Toosi EN (2015) Cognitive monitoring via eye tracking in virtual reality pedestrian environments. In: Proceedings of the 4th international symposium on pervasive displays, ACM, pp 269–270
  60. Subha DP, Joseph PK, Acharya R, Lim CM (2010) EEG signal analysis: a survey. J Med Syst 34(2):195–212
  61. Toutountzi T, Collander C, Phan S, Makedon F (2016) EyeOn: an activity recognition system using Myo armband. In: Proceedings of the 9th ACM international conference on pervasive technologies related to assistive environments, ACM, p 82
  62. Verkijika SF, De Wet L (2015) Using a brain-computer interface (BCI) in reducing math anxiety: evidence from South Africa. Comput Educ 81:113–122
  63. Vikram S, Li L, Russell S (2013) Handwriting and gestures in the air, recognizing on the fly. Proc CHI 13:1179–1184
  64. Villagrasa S, Fonseca D, Durán J (2014) Teaching case: applying gamification techniques and virtual reality for learning building engineering 3D arts. In: Proceedings of the second international conference on technological ecosystems for enhancing multiculturality, ACM, pp 171–177
  65. Wake N, Sano Y, Oya R, Sumitani M, Kumagaya S, Kuniyoshi Y (2015) Multimodal virtual reality platform for the rehabilitation of phantom limb pain. In: Neural engineering (NER), 2015 7th international IEEE/EMBS conference on, IEEE, pp 787–790
  66. Webel S, Olbrich M, Franke T, Keil J (2013) Immersive experience of current and ancient reconstructed cultural attractions. In: Digital heritage international congress (DigitalHeritage), 2013, IEEE, vol 1, pp 395–398
  67. Webster D, Celik O (2014) Systematic review of Kinect applications in elderly care and stroke rehabilitation. J Neuroeng Rehabil 11(1):108
  68. Weichert F, Bachmann D, Rudak B, Fisseler D (2013) Analysis of the accuracy and robustness of the Leap Motion controller. Sensors 13(5):6380–6393
  69. Weisz J, Shababo B, Dong L, Allen PK (2013) Grasping with your face. In: Desai JP, Dudek G, Khatib O, Kumar V (eds) Experimental robotics. Springer, Heidelberg, pp 435–448
  70. Yu N, Xu C, Wang K, Yang Z, Liu J (2015) Gesture-based telemanipulation of a humanoid robot for home service tasks. In: Cyber technology in automation, control, and intelligent systems (CYBER), 2015 IEEE international conference on, IEEE, pp 1923–1927
  71. Zecca M, Micera S, Carrozza MC, Dario P (2002) Control of multifunctional prosthetic hands by processing the electromyographic signal. Crit Rev Biomed Eng 30(4–6)
  72. Zyda M (2005) From visual simulation to virtual reality to games. Computer 38(9):25–32

Source: Impact of commercial sensors in human computer interaction: a review | SpringerLink


[Abstract] Motion Rehab AVE 3D: A VR-based exergame for post-stroke rehabilitation

Abstract

Background and objective

Research on games for post-stroke rehabilitation has been growing, focusing on upper-limb, lower-limb, and balance exercises and reporting good experiences and results. With this in mind, this paper presents Motion Rehab AVE 3D, a serious game for the rehabilitation of patients after mild stroke. The aim is to offer a new technology that assists traditional therapy and motivates patients to execute their rehabilitation program under the supervision of a health professional.

Methods

The game was developed with the Unity game engine, supporting the Kinect motion-sensing input device and display devices such as 3D smart TVs and the Oculus Rift. It comprises six activities based on exercises in three-dimensional space: flexion, abduction, shoulder adduction, horizontal shoulder adduction and abduction, elbow extension, wrist extension, knee flexion, and hip flexion and abduction. Motion Rehab AVE 3D also reports hits and errors so that the physiotherapist can evaluate the patient's progress.
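The abstract says the game reports hits and errors but does not show how repetitions are scored. A minimal sketch of such bookkeeping, assuming the motion sensor supplies the peak joint angle reached per repetition (our assumption, not the paper's code), could be:

```csharp
using System;

// Sketch of hit/error scoring for one exercise, assuming the motion sensor
// delivers the relevant joint angle in degrees. Hypothetical illustration,
// not the Motion Rehab AVE 3D source.
class ExerciseScorer
{
    public int Hits { get; private set; }
    public int Errors { get; private set; }

    readonly float targetAngle;   // e.g., 90 degrees of shoulder abduction
    readonly float tolerance;     // acceptable deviation in degrees

    public ExerciseScorer(float targetAngle, float tolerance = 10f)
    { this.targetAngle = targetAngle; this.tolerance = tolerance; }

    // Called once per attempted repetition with the peak angle reached.
    public void Score(float peakAngle)
    {
        if (Math.Abs(peakAngle - targetAngle) <= tolerance) Hits++;
        else Errors++;
    }
}
```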

Results

A pilot study with 10 healthy participants (61–75 years old) tested one of the game levels. Participants experienced the 3D user interface in the third person. Our initial goal was to identify a basic and comfortable equipment setup to adopt later. All participants (100%) classified the interaction process as interesting and amazing for their age group, indicating good acceptance.

Conclusions

Our evaluation showed that the game could be a useful tool to motivate patients during rehabilitation sessions. The next step is to evaluate its effectiveness with stroke patients, in order to verify whether the interface and the game exercises contribute to progress in motor rehabilitation treatment.

Source: Motion Rehab AVE 3D: A VR-based exergame for post-stroke rehabilitation – ScienceDirect


[WEB SITE] Reh@Panel (formerly RehabNet CP) – NeuroRehabLab Tools

 

Reh@Panel (formerly RehabNet CP) acts as a device router, bridging a large number of tracking devices and other hardware with the RehabNet Training Games for the patient to interact with. Reh@Panel implements the communication protocols in a client/server architecture (a hypothetical client sketch follows the device list below). Native device support:

Electrophysiological Data

 

  • Emotiv EPOC neuro-headset is integrated for acquiring raw EEG data, gyroscope data, facial expressions, and Emotiv’s Expressiv™, Cognitiv™, and Affectiv™ suites
  • Neurosky EEG headset is supported for raw EEG acquisition and eSense™ meters of attention and meditation
  • Myoelectric orthosis mPower 1000 (Myomo Inc, Boston, USA) is supported, providing 2 EMG channels and adjustable levels of assistance
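To illustrate the client/server routing idea, the sketch below shows a hypothetical game-side client reading line-delimited sensor samples from a local router socket. The port number and message format are invented for this example; the real Reh@Panel protocol is documented by the NeuroRehabLab.

```csharp
using System;
using System.Net.Sockets;
using System.Text;

// Hypothetical client for a device-router server like Reh@Panel: connect,
// then read line-delimited sensor messages and hand them to the game.
// Port and message format are invented for this sketch.
class DeviceRouterClient
{
    public static void Main()
    {
        using var client = new TcpClient("127.0.0.1", 4000);
        using var stream = client.GetStream();
        var buffer = new byte[1024];

        while (true)
        {
            int n = stream.Read(buffer, 0, buffer.Length);
            if (n <= 0) break;
            // e.g. "EEG,ch1=12.3,ch2=-4.5" — one routed device sample
            string message = Encoding.ASCII.GetString(buffer, 0, n);
            Console.WriteLine($"routed sample: {message.Trim()}");
        }
    }
}
```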

 

more —> Reh@Panel (formerly RehabNet CP) | NeuroRehabLab Tools


[WEB SITE] Virtual Reality and Rehabilitation — A Perfect Match?

9.2.2016 Ville Lahtinen

This is a story about VR Rehab. The story started last September in Demola, when a team of six students from different fields came together to work on a project called “Rehabilitation Using Virtual Reality”. During the course of the next four months this team explored the possibilities that VR brings to the world of rehabilitation.

Our team consisted of two coders, one business specialist, two health professionals, and me, the UX design / branding guy. The device Vincit gave us, Samsung Gear VR, proved to be a really nice headset with one small drawback: it doesn’t offer any hand tracking possibilities. And as hand movements play a pretty big role in many rehabilitation exercises, this was a challenge. But we of course like challenges.

The first target group we started designing for were hemispatial neglect patients. Neglect is a complex neurophysiological condition in which patients fail to be aware of items to one side of space. The majority of neglect patients are old people with brain injuries, and this target group quickly turned out to be a bit too challenging for us, as virtual reality is so immersive and therefore maybe a bit frightening for many older people.

After some more research we came up with the idea of a balance training game. Balance problems are common and their rehabilitation doesn’t necessarily require awareness of hand positions. The basic idea of the game was quickly formed: the patients’ objective is to focus their gaze on moving objects and keep looking at them for a certain time period. In the first version of the game these objects are fish and the background is an underwater view.
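Since the Gear VR offered no hand tracking, selection in the game works purely by gaze and dwell time. A Unity sketch of such a mechanic (our own illustration, assuming fish objects are tagged "Fish"; this is not the team's code) could look like this:

```csharp
using UnityEngine;

// Gaze-dwell selection sketch for a headset without hand tracking:
// a ray from the camera accumulates dwell time on the object it hits,
// and the object counts as "caught" after a configurable hold duration.
public class GazeDwellSelector : MonoBehaviour
{
    public Camera vrCamera;
    public float requiredDwellSeconds = 2f;

    GameObject currentTarget;
    float dwellTime;

    void Update()
    {
        Ray gaze = new Ray(vrCamera.transform.position,
                           vrCamera.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, 50f)
            && hit.collider.CompareTag("Fish"))
        {
            if (hit.collider.gameObject == currentTarget)
                dwellTime += Time.deltaTime;     // keep looking: accumulate
            else { currentTarget = hit.collider.gameObject; dwellTime = 0f; }

            if (dwellTime >= requiredDwellSeconds)
            {
                Debug.Log($"Caught {currentTarget.name}");
                Destroy(currentTarget);          // fish "caught"
                currentTarget = null; dwellTime = 0f;
            }
        }
        else { currentTarget = null; dwellTime = 0f; }
    }
}
```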

Everything in the game is meant to be fully customizable for each patient separately, though because of time constraints our team didn’t have the resources to fully implement everything we planned. With the help of a 360-degree camera, the background could in theory also be changed to a view that is meaningful for the patient. So if the patient, for example, has to spend large amounts of time indoors, they could visit familiar places virtually this way.

catch the fish

We did some user testing with our game at Pirkanmaan Erikoiskuntoutus, and the feedback we got was really positive. With some more work the game will certainly be of use for many patients. For example added levels and awards would probably make the game even more interesting and motivating.

In the coming months the first “real” VR headsets, like the Oculus Rift and HTC Vive, will be released to the consumer market. The possibilities they offer for rehabilitation are much bigger, not just because of hand tracking but also because of the added computational power. We therefore turned our thoughts to the future and made a prototype for a platform that would function as a collection of different games and experiences designed for rehabilitation use. This platform would store patient data so that all progress could easily be tracked and monitored, and naturally it would work with every headset.

All in all we are pretty satisfied with the end result. Of course lots of things could have been made differently and more efficiently, but the goal was to explore the possibilities of VR, and that is what we did. Probably every team member also got new valuable experiences while working on this project. The victory of the Demola season we aimed for was frustratingly close, as we came second, but at least there were 25 teams behind us.

user testing

I’m convinced that virtual reality has lots of potential in making rehabilitation more fun and motivating. If you want to read more about the journey, please visit our blog.

Ville Lahtinen

Virtual Reality and Rehabilitation — A Perfect Match? – Ohjelmistotalo Vincit Oy


[BLOG POST] Virtual And Augmented Reality: A World of Potential for People with Disabilities

Gamers are already mostly familiar with the possibilities of virtual reality and are eagerly awaiting the shipment of multiple VR titles and headsets. This gives them a leg up on medical professionals, who might dismiss VR as merely a frivolous gaming tool. The truth is that this gaming technology could have wide-ranging applications in healthcare, particularly in the treatment of people with autism or disabilities.

The latest incarnations of virtual reality are a far cry from the clunky, headache-inducing units that achieved notoriety in the ’90s. The models that are expected to hit the marketplace very soon use advanced graphics capabilities and motion sensing equipment to deliver experiences that are realistic, attractive and appealing. The user usually has to put on a special headset that projects high-res images in front of the eyes. In order to control the action and change the view, he or she uses a combination of eye and head movements and hand-held devices.

Three large firms intend to release VR gear in 2016: Facebook, Sony, and Microsoft. Facebook purchased Oculus, maker of the Rift and perhaps the frontrunner in the VR landscape, in 2014; the Rift works with regular computers as a plug-in peripheral. Sony’s Project Morpheus, on the other hand, is designed as a headset for the PlayStation 4 console. Microsoft is serious about making its mark in this newly emerging industry and has even created a special version of Windows to support its HoloLens. The HoloLens differs from most competing products by mixing holographic elements into the real world instead of presenting an entirely made-up world.

A VR system by MindMaze in Switzerland shows promise in treating those with motor ability impairments. The patient’s head is wired up with electrodes, and then he or she tries to manipulate a virtual arm or leg. This causes the brain to more effectively use new neurons or repair damaged ones to compensate for the injury. The CEO of MindMaze has stated that motor function can improve by up to 35 percent after three weeks of using the system.

Autistic people, who often have difficulty interacting in the real world, may find respite in an imaginary setting. A study in North Carolina found that children with autism were willing and able to enter a virtual world, observe their surroundings, and move around. Another program, at the University of Texas at Dallas, explored the use of VR to let autistic people work on their social skills by taking part in virtual social interactions. The results showed that participants had heightened activity in the parts of the brain responsible for social perception.

Virtual reality can extend the capabilities of people who are disabled or suffer from debilitating illnesses. Through the use of a VR system developed by FOVE, even people who have lost the use of their hands can play the piano by using eye movements and blinks to select the notes to play. Another simulation, all the way back in 1994, allowed a boy with cerebral palsy to take a virtual stroll through a grassy field.

Many of the medical applications of VR also translate well into other spheres of society. The ability to enter make-believe environments could be useful in education by allowing students to study distant or bygone places and in sports by enabling people to virtually attend contests and cheer for their teams. In architecture and home security, people could see how proposed changes to a building’s design and various security systems would actually function before doing the remodeling work. The potential uses in marketing are enormous and include advertising, product demos and virtual property tours.

Access to nearly infinite fictional, virtual worlds opens up space for treatments that would otherwise be difficult or impossible. As the technology continues to improve and prices get lower, we’ll see a growing use of VR systems to help people with a broad range of conditions, including many disabilities. This will be only a part of a broader move by society as a whole to embrace these exciting advancements.

Source: Assistive Technology Blog: Virtual And Augmented Reality: A World of Potential for People with Disabilities


[ARTICLE] An Approach to Physical Rehabilitation Using State-of-the-art Virtual Reality and Motion Tracking Technologies – Full Text PDF

Abstract

This paper explores an approach to physical rehabilitation using state-of-the-art technologies in virtual reality and motion tracking; in particular, the Oculus Rift DK2 (released in July 2014) and Intel RealSense (released in November 2014) are used. A game was developed that requires the patient to perform an established set of abduction and adduction arm movements to achieve rotator cuff rehabilitation after injury. While clinical trials are outside the scope of this work, experts in physical rehabilitation working in the medical field have carried out a preliminary evaluation, showing encouraging results.
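The paper's full text, not its code, is linked below; as context for readers, detecting an abduction movement can be as simple as measuring the angle between the tracked arm vector and the body's downward axis. The sketch below illustrates that assumed approach with invented joint names and thresholds.

```csharp
using System;
using System.Numerics;

// Sketch: classify shoulder abduction from tracked joint positions by
// measuring the arm's angle away from the body's downward axis.
// Thresholds and joint names are illustrative, not from the paper.
static class AbductionDetector
{
    public static float AbductionAngleDegrees(Vector3 shoulder, Vector3 wrist)
    {
        Vector3 arm = Vector3.Normalize(wrist - shoulder);
        Vector3 down = new Vector3(0, -1, 0); // arm at the side = 0 degrees
        float cos = Math.Clamp(Vector3.Dot(arm, down), -1f, 1f);
        return (float)(Math.Acos(cos) * 180.0 / Math.PI);
    }

    // A repetition counts once the arm is raised past a target angle.
    public static bool IsAbducted(Vector3 shoulder, Vector3 wrist,
                                  float thresholdDegrees = 80f) =>
        AbductionAngleDegrees(shoulder, wrist) >= thresholdDegrees;
}
```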

Full Text PDF

Source: An Approach to Physical Rehabilitation Using State-of-the-art Virtual Reality and Motion Tracking Technologies

 

 

