Posts Tagged Support vector machine

[ARTICLE] Emotion Regulation Using Virtual Environments and Real-Time fMRI Neurofeedback – Full Text

Neurofeedback (NFB) enables the voluntary regulation of brain activity, with promising applications to enhance and recover emotional and cognitive processes and their underlying neurobiology. It remains unclear whether NFB can be used to aid and sustain complex emotions, which has implications for ecological validity. We provide a technical proof of concept of a novel real-time functional magnetic resonance imaging (rtfMRI) NFB procedure. Using rtfMRI-NFB, we enabled participants to voluntarily enhance their own neural activity while they experienced complex emotions. The rtfMRI-NFB software (FRIEND Engine) was adapted to provide a virtual environment as a brain-computer interface (BCI) and musical excerpts to induce two emotions (tenderness and anguish), aided by participants’ preferred personalized strategies to maximize the intensity of these emotions. Eight participants from two experimental sites performed rtfMRI-NFB on two consecutive days in a counterbalanced design. On one day, rtfMRI-NFB was delivered to participants using a region of interest (ROI) method; on the other day, using a support vector machine (SVM) classifier. Our multimodal VR/NFB approach was technically feasible and robust as a method for real-time measurement of the neural correlates of complex emotional states and their voluntary modulation. Guided by the color changes of the virtual environment BCI during rtfMRI-NFB, participants successfully increased, in real time, the activity of the septo-hypothalamic area and the amygdala during ROI-based rtfMRI-NFB, and successfully evoked distributed patterns of brain activity classified as tenderness and anguish during SVM-based rtfMRI-NFB. Offline fMRI analyses confirmed that during tenderness rtfMRI-NFB conditions, participants recruited the septo-hypothalamic area and other regions ascribed to social affiliative emotions (medial frontal/temporal pole and precuneus).
During anguish rtfMRI-NFB conditions, participants recruited the amygdala, the dorsolateral prefrontal cortex, and additional regions associated with negative affect. These findings were robust, demonstrable at the individual-subject level, reflected in self-reported emotion intensity during rtfMRI-NFB, and observed with both ROI and SVM methods and across the two sites. Our multimodal VR/rtfMRI-NFB protocol provides an engaging tool for brain-based interventions to enhance emotional states in healthy subjects and may find applications in clinical conditions associated with anxiety, stress and impaired empathy, among others.


Neurofeedback (NFB) is a novel application of brain-computer interfaces that aids real-time voluntary regulation of brain activity. Mounting evidence shows that NFB has promising effects for enhancing behavioral, cognitive and emotional processes in normative samples (1–5). NFB has also seen preliminary use to restore aberrant neurobiology and symptoms in neurological conditions (e.g., stroke, traumatic brain injury) and in psychopathology (e.g., ADHD, autism, depression, addiction) (1–7). Real-time functional magnetic resonance imaging (rtfMRI) based NFB has the potential to provide insight into the mechanisms of psychological states (8–10). These include affiliative emotions (11) underpinned by deep brain nuclei (12, 13), the activity of which is unlikely to be robustly measured via surface electroencephalography.

rtfMRI-NFB tools can be used to study the causal mechanisms of complex emotions and to inform evidence-based personalized interventions to enhance and recover aberrant emotional states (and their neural substrates) in normative and clinical samples. One key practical challenge of fMRI studies is that participants become distracted and have difficulty experiencing valid psychological states in the scanner environment, particularly when trying to sustain complex emotions.

Recent studies have combined immersive virtual environments with multiple sensory modalities to deliver psychological/cognitive interventions, and to enhance their effectiveness via engaging and motivating individuals to practice (14–16).

Only two proof-of-concept studies have combined rt-NFB with virtual environments as brain-computer interfaces (BCI). An electroencephalography-based NFB study computed brain activity from about 500 participants collectively during an interactive game of relaxation and concentration over one night (16), in which individuals’ levels of brain activity could not be discerned. A separate rtfMRI-NFB paradigm used a virtual fire interface to up-regulate and down-regulate brain activity in eight healthy participants, but this was devoid of any emotional states and far from multimodal and immersive (17).

It remains untested whether rt-NFB platforms integrating multisensory virtual environments can successfully recruit complex emotions and sustain these emotions long and strongly enough to probe their underlying neural correlates. Such a platform can advance NFB applications via (i) increasing the ecological validity of rtfMRI-NFB experiments and their relevance for the daily experience of emotions outside of experimental settings, and (ii) adapting NFB interfaces to the individual and target population so that these are more relatable, engaging and effective in generating and sustaining the complex emotions that maximize the success of rtfMRI-NFB interventions (18–20).

This study aims to demonstrate the feasibility of an engaging rtfMRI-NFB interface that can be individually tailored and, specifically, to provide a proof of concept for an rtfMRI-NFB protocol integrating a virtual environment as a BCI and musical stimuli, using both local (region of interest, ROI) and distributed (support vector machine, SVM) analyses. The FRIEND Engine Framework system (21) was enhanced and adapted for this aim. We recruited healthy young adults who performed rtfMRI-NFB during complex emotion experiences, including tenderness, a positive affiliative emotion, and anguish, a self-reflective negative emotion (11, 13, 22–25).
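The distributed (SVM) method described above classifies whole patterns of voxel activity rather than averaging signal within a single ROI. A minimal sketch of the idea, using scikit-learn on simulated data (the voxel counts, toy signal, and sigmoid feedback mapping are illustrative assumptions, not the authors' implementation):

```python
# Illustrative only: a linear SVM trained on simulated voxel patterns,
# whose decision value is turned into a graded feedback score.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200

# Simulated training data: one voxel pattern per trial, labeled by emotion.
X = rng.normal(size=(n_trials, n_voxels))
y = np.repeat(["tenderness", "anguish"], n_trials // 2)
X[y == "tenderness", :10] += 2.0   # toy signal in a subset of voxels

clf = SVC(kernel="linear").fit(X, y)

# At feedback time, the signed distance to the separating hyperplane gives
# a graded measure of pattern expression, squashed to [0, 1] for display.
new_pattern = rng.normal(size=(1, n_voxels))
score = clf.decision_function(new_pattern)[0]
feedback = 1.0 / (1.0 + np.exp(-score))
```

In a real-time setting, the graded score (rather than a hard class label) is what allows the interface to display continuous progress toward the target emotion.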

We also aimed to validate the functional anatomy of these complex emotions during rtfMRI-NFB. After the real-time data were collected, we ran offline fMRI analyses to verify the effects of the real-time neurofeedback task on brain activity, using standard preprocessing and statistical analysis methods.

We hypothesized that participants would voluntarily change the color of the virtual environment in the BCI during rtfMRI-NFB using the activity of the following regions: (i) for the tenderness condition, the septo-hypothalamic area (when using the ROI-based rtfMRI-NFB method) and other brain areas ascribed to positive affiliative emotions, i.e., medial orbitofrontal areas (when using the SVM-based rtfMRI-NFB method) (11, 25–27); and (ii) for the anguish condition, the amygdala (during the ROI-based rtfMRI-NFB method) and also lateral prefrontal cortices implicated in negative affect (e.g., anguish, fear, anxiety, negative mood, stress, psychological pain) and in psychopathologies where negative affect is a feature [e.g., depression and generalized anxiety disorder (28–32)] (during SVM-based rtfMRI-NFB).

Materials and Methods


We used a single subject, repeated measures design with two identical assessments on two consecutive days, counterbalanced by rtfMRI-NFB method (i.e., ROI and SVM). We recruited eight psychiatrically and neurologically healthy postgraduate research students, free of psychoactive medication and with normal or corrected-to-normal vision. Four participants were recruited from the D’Or Institute for Research and Education (IDOR) in Rio de Janeiro, Brazil (approved by the Ethics and Scientific committees of the Copa D’Or Hospital, Rio de Janeiro, Brazil – No 922.218). To validate the protocol in a different scanner and institution, we also recruited four participants from the Monash Biomedical Imaging (MBI) at Monash University in Melbourne, Australia (MUHREC CF15/1756 – 2015000893). All volunteers provided written informed consent prior to study participation.

Design of the Neurofeedback BCI

Supplementary Video 1 and Figure 1 show the BCI used for the rtfMRI-NFB. The BCI comprised a virtual environment as a medium to convey sensory feedback to participants in real time, in association with ongoing tenderness, anguish and neutral emotional states. The virtual environment was created by editing the Unity 3D asset Autumnal Nature Pack (Unity 3D) and displayed first-person navigation at walking speed through hills and cornfields, with a total duration of 10′8″ (Supplementary Video 1). The virtual environment was prepared to alternate between different trial types: neutral (30″), tenderness (46″) and anguish (46″).

The trial types were displayed via changes in the base color hues of the virtual environment and via specific music excerpts. Music excerpts were fixed for each trial type and not influenced by current neural/psychological states (no music for neutral trials, mild, gentle music for tenderness and eerie, distorted music for anguish). Music excerpts were selected from 20 audio tracks, all normalized using the root mean square feature of Audacity software (Audacity). The audio tracks were previously rated to have comparable volume, pace, and rhythm. For the rtfMRI-NFB task runs, four excerpts for tenderness and four excerpts for anguish were played.
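RMS normalization of the kind applied in Audacity scales each track so that all excerpts play back at a comparable loudness. A minimal sketch of the computation (the target level and function names are illustrative, not the authors' code):

```python
# Illustrative sketch: scale each audio signal to a common RMS level so
# that all music excerpts have comparable loudness.
import numpy as np

def rms(x):
    """Root-mean-square level of a signal."""
    return np.sqrt(np.mean(np.square(x)))

def normalize_rms(x, target_rms=0.1):
    """Scale signal x so that its RMS level equals target_rms."""
    return x * (target_rms / rms(x))

# Toy example: a quiet and a loud sine tone end up at the same RMS level.
t = np.linspace(0, 1, 44100, endpoint=False)
quiet = 0.01 * np.sin(2 * np.pi * 440 * t)
loud = 0.8 * np.sin(2 * np.pi * 440 * t)
print(rms(normalize_rms(quiet)), rms(normalize_rms(loud)))
```

Matching RMS rather than peak amplitude is what makes perceived loudness comparable across excerpts with different dynamics.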

Neutral trials were characterized by a normal colored virtual landscape displayed in the BCI with no background music. Tenderness trials were characterized by a change in the color of the virtual landscape to orange and were accompanied by tenderness music excerpts. Anguish trials commenced when the color of the environment turned to purple hues and were accompanied by anguish music excerpts.

Neurofeedback Task

Task Practice Outside the MRI

For training purposes, we recorded a video showing a sample of the virtual environment. The video lasted as long as one run of the rtfMRI-NFB task (10′ 8″) and was used by participants to practice tenderness, anguish and neutral states before the MRI. With this practice, participants could learn which music tracks and VR color changes in the BCI corresponded to tenderness, anguish and neutral trials.

Neurofeedback Interface

As shown in Figure 1, instead of a classic thermometer display, the color of the virtual environment BCI changed in real time with increasing engagement of the neural activity/pattern corresponding to the target emotional state: orange for tenderness trials, purple for anguish trials and natural light tones for neutral trials. Participants were instructed to experience tenderness or anguish as intensely as possible during the respective trials and to increase the intensity of their emotion so as to turn the color of the virtual environment, in real time, as orange as possible during tenderness trials and as purple as possible during anguish trials, which in turn reflected increases in the corresponding neural activity/pattern.
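The display logic described above amounts to blending the scene's base color toward a target hue in proportion to the feedback signal. A hypothetical sketch (the RGB values and the linear blend are illustrative assumptions, not the published implementation):

```python
# Hypothetical sketch: map a neural feedback value in [0, 1] to a scene
# color by blending from neutral toward the emotion's target hue.
NEUTRAL = (1.0, 1.0, 1.0)                    # natural light tones
TARGET = {"tenderness": (1.0, 0.55, 0.0),    # orange
          "anguish": (0.5, 0.0, 0.5)}        # purple

def blend(feedback, emotion):
    """Linearly interpolate from the neutral color toward the target hue."""
    f = max(0.0, min(1.0, feedback))         # clamp feedback to [0, 1]
    return tuple(n + f * (t - n) for n, t in zip(NEUTRAL, TARGET[emotion]))

print(blend(0.0, "tenderness"))   # no engagement: neutral scene
print(blend(1.0, "anguish"))      # full engagement: fully purple scene
```

A continuous blend like this gives participants graded feedback, so partial engagement of the target region or pattern is still visible as a partial color shift.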


Figure 1. Color hue modulation of the virtual environment during rtfMRI-NFB. The color hue changes from baseline neutral trials to a more intense orange and purple as participants increasingly engage target brain regions for tenderness and anguish trials.


via Frontiers | Emotion Regulation Using Virtual Environments and Real-Time fMRI Neurofeedback | Neurology



[ARTICLE] fNIRS-based Neurorobotic Interface for gait rehabilitation – Full Text



In this paper, a novel functional near-infrared spectroscopy (fNIRS)-based brain-computer interface (BCI) framework for control of prosthetic legs and rehabilitation of patients suffering from locomotive disorders is presented.


fNIRS signals are used to initiate and stop the gait cycle, while a nonlinear proportional derivative computed torque controller (PD-CTC) with gravity compensation is used to control the torques of the hip and knee joints for minimization of position error. In the present study, the brain signals of walking-intention and rest tasks were acquired from the left hemisphere’s primary motor cortex for nine subjects. Thereafter, for removal of motion artifacts and physiological noises, the performances of six different filters (i.e., Kalman, Wiener, Gaussian, hemodynamic response filter (hrf), band-pass, finite impulse response) were evaluated. Then, six different features were extracted from oxygenated hemoglobin signals, and their different combinations were used for classification. Also, the classification performances of five different classifiers (i.e., k-nearest neighbour, quadratic discriminant analysis, linear discriminant analysis (LDA), Naïve Bayes, support vector machine (SVM)) were tested.


The classification accuracies obtained from SVM using the hrf were significantly higher (p < 0.01) than those of the other classifier/filter combinations. Those accuracies were 77.5, 72.5, 68.3, 74.2, 73.3, 80.8, 65, 76.7, and 86.7% for the nine subjects, respectively.


The control commands generated using the classifiers initiated and stopped the gait cycle of the prosthetic leg, the knee and hip torques of which were controlled using the PD-CTC to minimize the position error. The proposed scheme can be effectively used for neurofeedback training and rehabilitation of lower-limb amputees and paralyzed patients.
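The PD-CTC law mentioned above computes joint torques from the tracking error through the manipulator dynamics. A sketch of the standard computed-torque form with gravity compensation, using placeholder dynamics for a two-joint (hip, knee) leg; the gains and dynamics terms are illustrative assumptions, not the paper's robot model:

```python
# Illustrative PD computed-torque control law with gravity compensation:
#   tau = M(q) * (ddq_des + Kd*de + Kp*e) + C(q, dq)*dq + G(q)
import numpy as np

Kp = np.diag([100.0, 80.0])   # proportional gains (hip, knee)
Kd = np.diag([20.0, 15.0])    # derivative gains (hip, knee)

def pd_ctc_torque(q, dq, q_des, dq_des, ddq_des, M, C, G):
    """Joint torques driving position error e and velocity error de to zero."""
    e = q_des - q             # joint position error
    de = dq_des - dq          # joint velocity error
    return M @ (ddq_des + Kd @ de + Kp @ e) + C @ dq + G

# Toy call with placeholder dynamics matrices (unit inertia, no Coriolis,
# zero gravity torque) and a desired pose at the origin:
q, dq = np.array([0.1, 0.2]), np.zeros(2)
M, C, G = np.eye(2), np.zeros((2, 2)), np.zeros(2)
tau = pd_ctc_torque(q, dq, np.zeros(2), np.zeros(2), np.zeros(2), M, C, G)
```

Because the error terms are multiplied through the inertia matrix M(q), the closed-loop error dynamics become linear and decoupled, which is the standard motivation for computed-torque over plain PD control.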


Neurological disability due specifically to stroke or spinal cord injury can profoundly affect the social life of paralyzed patients [1–3]. The resultant gait impairment is a large contributor to ambulatory dysfunction [4]. In order to regain complete functional independence, physical rehabilitation remains the mainstay option; owing to the significant expense of health care and the redundancy of therapy sessions, robotic rehabilitation devices have been developed as alternatives to traditional, expensive and time-consuming exercises in busy daily life. In the past, similar training sessions on treadmills performed using robotic mechanisms have shown better functional outcomes [1, 2, 5–7]. However, these devices have limitations particular to given research and clinical settings. Therefore, wearable upper- and lower-limb robotic devices have been developed [7, 8], which are used to assist users by actuating joints to partial or complete movement using brain intentions, according to individual-patient needs.

To date, various noninvasive modalities including functional magnetic resonance imaging (fMRI), electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) have been used to acquire brain signals. fNIRS is a relatively new modality that detects brain intention with reference to changes in hemodynamic response. Its fewer artifacts, better spatial resolution and acceptable temporal resolution make it a promising choice for applications such as rehabilitation and mental tasks [9–20]. The main brain-computer interface (BCI) challenge in this regard is to extract useful information from raw brain signals for control-command generation [21–23]. Acquired signals are processed in the following four stages: preprocessing, feature extraction, classification, and command generation. In preprocessing, physiological and instrumental artifacts and noises are removed [24, 25]. After this filtration stage, feature extraction proceeds in order to gather useful information. Then, the extracted features are classified using different classifiers. Finally, the trained classifier is used to generate control commands based on a trained model [23]. Figure 1 shows a schematic of a BCI.
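The four stages above can be sketched end to end as a small pipeline. This is a toy stand-in, not the paper's method: the moving-average filter, the two features, and the threshold "classifier" are illustrative assumptions replacing the filters and trained classifiers evaluated in the study.

```python
# Toy sketch of the four BCI stages: preprocess -> features -> classify -> command.
import numpy as np

def preprocess(signal, window=5):
    """Smooth the raw signal with a moving average to suppress noise."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def extract_features(signal):
    """Two common fNIRS features: signal mean and signal slope."""
    slope = np.polyfit(np.arange(signal.size), signal, 1)[0]
    return np.array([signal.mean(), slope])

def classify(features, threshold=0.05):
    """Stand-in for a trained classifier: elevated mean HbO => walking intent."""
    return "walk" if features[0] > threshold else "rest"

def command(label):
    """Map the predicted class to a gait-cycle control command."""
    return {"walk": "START_GAIT", "rest": "STOP_GAIT"}[label]

rest = np.zeros(50)                  # simulated rest-task HbO signal
active = np.linspace(0, 0.3, 50)     # simulated HbO rise during intention
print(command(classify(extract_features(preprocess(active)))))
print(command(classify(extract_features(preprocess(rest)))))
```

In the actual study, the filter in the first stage and the classifier in the third were the variables under comparison; the pipeline shape itself is what stays fixed.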

Fig. 1 Schematic of BCI


via fNIRS-based Neurorobotic Interface for gait rehabilitation | Journal of NeuroEngineering and Rehabilitation | Full Text


