[ARTICLE] MEG-based neurofeedback for hand rehabilitation

Abstract

Background: Providing neurofeedback (NF) of motor-related brain activity in a biologically relevant and intuitive way could maximize the utility of a brain-computer interface (BCI) for promoting therapeutic plasticity. We present a BCI capable of providing intuitive and direct control of a video-based grasp.

Methods: Utilizing the high temporal and spatial resolution of magnetoencephalography (MEG), we recorded sensorimotor rhythms (SMR) that were modulated by grasp or rest intentions. SMR modulation controlled the grasp aperture of a stop-motion video of a human hand. The displayed hand grasp position was driven incrementally toward a closed or open state, and subjects were required to hold the targeted position for a time that was adjusted to change the task difficulty.

Results: We demonstrated that three individuals with complete hand paralysis due to spinal cord injury (SCI) were able to maintain brain-control of closing and opening a virtual hand with an average success rate of 63%, significantly above the average chance rate of 19%. This level of performance was achieved without pre-training and with less than 4 min of calibration. In addition, successful grasp targets were reached in 1.96 ± 0.15 s. Subjects performed 200 brain-controlled trials in approximately 30 min, excluding breaks. Two of the three participants showed a significant improvement in SMR modulation, indicating that they had learned to change their brain activity within a single session of NF.

Conclusions: This study demonstrated the utility of a MEG-based BCI system to provide realistic, efficient, and focused NF to individuals with paralysis, with the goal of using NF to induce neuroplasticity.


Fig. 1. Schematic of the BCI used to translate SMR into proportional control of grasping. Beginning in the upper left: first, the power spectrum of data recorded from 36 sensorimotor MEG sensors (shown on a top-down view of the MEG helmet) is computed using 300 ms sliding windows. A mask is applied to these features to remove any components that did not exhibit desynchronization during calibration. Then a linear decoder applies weights (W) to the neural signal (N) to compute a hand velocity value (V_H). The velocity output from the decoder is scaled (g) to ensure movement speeds are appropriate for the task. The previous hand position (an image from the video sequence) is then updated toward a more closed or more open state within the range of motion (ROM) based on the scaled velocity command. The picture representing the desired aperture is chosen from 25 possible images. A progressive change in the images appeared to participants as a grasping movie with a 76 ms refresh rate.
Foldes et al. Journal of NeuroEngineering and Rehabilitation 2015 12:85   doi:10.1186/s12984-015-0076-7
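The pipeline in Fig. 1 can be sketched as a simple control loop: masked spectral features are linearly combined into a velocity, the velocity is scaled and integrated into an aperture position clamped to the ROM, and the position selects one of the 25 hand images. The sketch below is illustrative only; all names (decode_velocity, N_FRAMES, etc.) and the exact clamping behavior are assumptions, not details from the paper.

```python
import numpy as np

N_FRAMES = 25       # number of hand images spanning the range of motion (ROM)
REFRESH_S = 0.076   # 76 ms display refresh interval

def decode_velocity(power_features, mask, weights, gain):
    """Linear decoder: masked spectral features -> scaled hand velocity.

    mask zeroes out components that showed no desynchronization during
    calibration; weights (W) map the remaining features (N) to V_H,
    and gain (g) scales the speed for the task.
    """
    n = power_features * mask
    v_h = float(weights @ n)   # V_H = W . N
    return gain * v_h

def update_position(position, velocity, dt=REFRESH_S):
    """Integrate the velocity command; clamp aperture to the ROM [0, 1]."""
    return min(1.0, max(0.0, position + velocity * dt))

def frame_index(position):
    """Pick one of the 25 images for the current aperture position."""
    return int(round(position * (N_FRAMES - 1)))
```

For example, a fully open hand (position 0.0) maps to the first image and a fully closed hand (position 1.0) to the last; each 76 ms refresh nudges the position by velocity × dt and redraws the corresponding frame, which is what appears to the participant as a smooth grasping movie.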