Posts Tagged virtual environments
[Abstract] User Experience Evaluation of an Interactive Virtual Reality-Based System for Upper Limb Rehabilitation – IEEE Conference Publication
The main goal of this project is to refine and optimize elements of the virtual reality-based training paradigms to enhance neuroplasticity and maximize recovery of function in the hemiplegic hand of patients who had a stroke.
PIs: Sergei Adamovich, Alma Merians, Eugene Tunik, A.M. Barrett
This application seeks funding to continue our ongoing investigation into the effects of intensive, high-dosage, task- and impairment-based training of the hemiparetic hand, using haptic robots integrated with complex gaming and virtual reality simulations. A growing body of work suggests that there is a time-limited period of post-ischemic heightened neuronal plasticity during which intensive training may optimally affect the recovery of gross motor skills, indicating that the timing of rehabilitation is as important as the dosing. However, recent literature points to a controversy regarding both the value of intensive, high-dosage therapy and its optimal timing in the first two months after stroke. Our study is designed to investigate this controversy empirically. Furthermore, current service delivery models in the United States limit treatment time and length of hospital stay during this period. To facilitate timely discharge from the acute care hospital or the acute rehabilitation setting, the initial priority for rehabilitation is independence in transfers and ambulation. This has negatively impacted the provision of intensive hand and upper extremity therapy during this period of heightened neuroplasticity. Providing additional, intensive therapy during the acute rehabilitation stay is more complicated to implement, and more difficult for patients to tolerate, than initiating it in the outpatient setting immediately after discharge. Our pilot data show that we are able to integrate intensive, targeted hand therapy into the routine of an acute rehabilitation setting. Our system has been specifically designed to deliver hand training when motion and strength are limited. The system uses adaptive algorithms to drive individual finger movement, gain adaptation and workspace modification to increase finger range of motion, and haptic and visual feedback from mirrored movements to reinforce motor networks in the lesioned hemisphere.
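The gain-adaptation idea mentioned above can be illustrated with a minimal sketch. All names, angles and thresholds here are illustrative assumptions, not the authors' implementation: the virtual hand amplifies the patient's limited finger flexion so that small real movements produce full virtual range of motion, with the gain fading out as measured range improves.

```python
def adapted_virtual_angle(measured_angle_deg, patient_rom_deg, target_rom_deg=90.0):
    """Map a patient's limited finger flexion onto the full virtual range.

    measured_angle_deg: current flexion angle from the sensor glove (hypothetical input)
    patient_rom_deg: patient's current active range of motion
    target_rom_deg: full virtual range displayed in the simulation

    The visual gain shrinks toward 1.0 as the patient's ROM approaches
    the target, so the amplification fades out with recovery.
    """
    gain = target_rom_deg / max(patient_rom_deg, 1e-6)
    return min(measured_angle_deg * gain, target_rom_deg)

# A patient with a 30-degree active range sees full virtual flexion:
print(adapted_virtual_angle(30.0, 30.0))   # 90.0
# Halfway through their range maps to half the virtual range:
print(adapted_virtual_angle(15.0, 30.0))   # 45.0
```

The design choice being sketched is that a success experience (full virtual movement) is always reachable within the patient's current ability, while progress is encoded in the shrinking gain rather than in the raw movement.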
We will translate the extensive experience gained in our previous studies of patients in the chronic phase to investigate the effects of this type of intervention on recovery and function of the hand when training is initiated within the early period of heightened plasticity. We will integrate the behavioral, kinematic/kinetic, and neurophysiological aspects of recovery to determine: (1) whether early intensive training focusing on the hand will result in a more functional hemiparetic arm; (2) whether it is necessary to initiate intensive hand therapy during the very early inpatient rehabilitation phase, or whether comparable outcomes will be achieved if the therapy is initiated right after discharge, in the outpatient period; and (3) whether the effect of the early intervention observed at 6 months post-stroke can be predicted by the cortical reorganization evaluated immediately after the therapy. This proposal will fill a critical gap in the literature and make a significant advancement in the investigation of putative interventions for recovery of hand function in patients post-stroke. Currently, relatively little is known about the effect of very intensive, progressive VR/robotics training in the acute early period (5-30 days) post-stroke. This proposal can move us past a critical barrier to the development of more effective approaches in stroke rehabilitation targeted at the hand and arm.
[ARTICLE] Emotion Regulation Using Virtual Environments and Real-Time fMRI Neurofeedback – Full Text
Neurofeedback (NFB) enables the voluntary regulation of brain activity, with promising applications for enhancing and restoring emotional and cognitive processes and their underlying neurobiology. It remains unclear whether NFB can be used to aid and sustain complex emotions, with implications for ecological validity. We provide a technical proof of concept of a novel real-time functional magnetic resonance imaging (rtfMRI) NFB procedure. Using rtfMRI-NFB, we enabled participants to voluntarily enhance their own neural activity while they experienced complex emotions. The rtfMRI-NFB software (FRIEND Engine) was adapted to provide a virtual environment as a brain-computer interface (BCI) and musical excerpts to induce two emotions (tenderness and anguish), aided by participants’ preferred personalized strategies to maximize the intensity of these emotions. Eight participants from two experimental sites performed rtfMRI-NFB on two consecutive days in a counterbalanced design. On one day, rtfMRI-NFB was delivered using a region of interest (ROI) method; on the other day, using a support vector machine (SVM) classifier. Our multimodal VR/NFB approach was technically feasible and robust as a method for real-time measurement of the neural correlates of complex emotional states and their voluntary modulation. Guided by the color changes of the virtual environment BCI during rtfMRI-NFB, participants successfully increased, in real time, the activity of the septo-hypothalamic area and the amygdala during the ROI-based rtfMRI-NFB, and successfully evoked distributed patterns of brain activity classified as tenderness and anguish during SVM-based rtfMRI-NFB. Offline fMRI analyses confirmed that during tenderness rtfMRI-NFB conditions, participants recruited the septo-hypothalamic area and other regions ascribed to social affiliative emotions (medial frontal/temporal pole and precuneus).
During anguish rtfMRI-NFB conditions, participants recruited the amygdala, dorsolateral prefrontal cortex, and additional regions associated with negative affect. These findings were robust, demonstrable at the individual-subject level, reflected in self-reported emotion intensity during rtfMRI-NFB, and observed with both ROI and SVM methods and across the two sites. Our multimodal VR/rtfMRI-NFB protocol provides an engaging tool for brain-based interventions to enhance emotional states in healthy subjects and may find applications in clinical conditions associated with anxiety, stress and impaired empathy, among others.
Neurofeedback (NFB) is a novel application of brain-computer interfaces that enables real-time voluntary regulation of brain activity. Mounting evidence shows that NFB has promising effects in enhancing behavioral, cognitive and emotional processes in normative samples (1–5). NFB has also been used in preliminary work to restore aberrant neurobiology and symptoms in neurological conditions (e.g., stroke, traumatic brain injury) and in psychopathology (e.g., ADHD, autism, depression, addiction) (1–7). Real-time functional magnetic resonance imaging (rtfMRI)-based NFB has the potential to provide insight into the mechanisms of psychological states (8–10). These include affiliative emotions (11) underpinned by deep brain nuclei (12, 13), the activity of which is unlikely to be robustly measured via surface electroencephalography.
rtfMRI-NFB tools can be used to study the causal mechanisms of complex emotions and to inform evidence-based personalized interventions to enhance and restore aberrant emotional states (and their neural substrates) in normative and clinical samples. One key practical challenge of fMRI studies is that participants become distracted and find it difficult to experience valid psychological states in the scanner environment, particularly when trying to sustain complex emotions.
Recent studies have combined immersive virtual environments with multiple sensory modalities to deliver psychological/cognitive interventions, and to enhance their effectiveness via engaging and motivating individuals to practice (14–16).
Only two proof-of-concept studies have combined rt-NFB with virtual environments as brain-computer interfaces (BCIs). An electroencephalography-based NFB study computed brain activity from about 500 participants collectively during an interactive game of relaxation and concentration over one night (16), so individual levels of brain activity could not be discerned. A separate rtfMRI-NFB paradigm used a virtual fire interface to up-regulate and down-regulate brain activity in eight healthy participants, but this protocol did not involve emotional states and was far from multimodal or immersive (17).
It remains untested whether rt-NFB platforms integrating multisensory virtual environments can successfully recruit complex emotions and sustain them long enough, and at sufficient intensity, to probe their underlying neural correlates. Such a platform can advance NFB applications by (i) increasing the ecological validity of rtfMRI-NFB experiments and their relevance to the daily experience of emotions outside experimental settings, and (ii) adapting NFB interfaces to the individual and target population so they are more relatable, engaging and effective in generating and sustaining the complex emotions that maximize the success of rtfMRI-NFB interventions (18–20).
This study aims to demonstrate the feasibility of an engaging rtfMRI-NFB interface that can be individually tailored and, specifically, to provide a proof of concept for an rtfMRI-NFB procedure integrating a virtual environment as a BCI and musical stimuli, using both local (region of interest, ROI) and distributed (support vector machine, SVM) analyses. The FRIEND Engine Framework system (21) was enhanced and adapted for this aim. We recruited healthy young adults who performed rtfMRI-NFB during complex emotion experiences, including tenderness (a positive affiliative emotion) and anguish (a self-reflective negative emotion) (11, 13, 22–25).
We also aimed to validate the functional anatomy of these complex emotions during rtfMRI-NFB. After the real-time data were collected, we ran offline fMRI data analyses to verify the effects of the real-time neurofeedback task on brain activity, using standard preprocessing and statistical analysis methods.
We hypothesized that participants would voluntarily change the color of the virtual environment in the BCI during rtfMRI-NFB using the activity of the following regions: (i) for the tenderness condition, the septo-hypothalamic area (when using the ROI-based rtfMRI-NFB method) and other brain areas ascribed to positive affiliative emotions, i.e., medial orbitofrontal areas (when using the SVM-based rtfMRI-NFB method) (11, 25–27); and (ii) for the anguish condition, the amygdala (during the ROI-based rtfMRI-NFB method) and also lateral prefrontal cortices implicated in negative affect (e.g., anguish, fear, anxiety, negative mood, stress, psychological pain) and in psychopathologies where negative affect is a feature [e.g., depression and generalized anxiety disorder (28–32)] (during SVM-based rtfMRI-NFB).
Materials and Methods
We used a single-subject, repeated-measures design with two identical assessments on two consecutive days, counterbalanced by rtfMRI-NFB method (i.e., ROI and SVM). We recruited eight psychiatrically and neurologically healthy postgraduate research students, free of psychoactive medication and with normal or corrected-to-normal vision. Four participants were recruited from the D’Or Institute for Research and Education (IDOR) in Rio de Janeiro, Brazil (approved by the Ethics and Scientific committees of the Copa D’Or Hospital, Rio de Janeiro, Brazil – No 922.218). To validate the protocol in a different scanner and institution, we also recruited four participants from Monash Biomedical Imaging (MBI) at Monash University in Melbourne, Australia (MUHREC CF15/1756 – 2015000893). All volunteers provided written informed consent prior to study participation.
Design of the Neurofeedback BCI
Supplementary Video 1 and Figure 1 show the BCI used for the rtfMRI-NFB. The BCI comprised a virtual environment as a medium to convey sensory feedback to participants in real time, in association with ongoing tenderness, anguish and neutral emotional states. The virtual environment was created by editing the Unity 3D asset Autumnal Nature Pack (Unity 3D, https://assetstore.unity.com/packages/3d/environments/autumnal-nature-pack-3649) and displayed first-person navigation at walking speed through hills and cornfields, with a total duration of 10′8″ (Supplementary Video 1). The virtual environment alternated between different trial types: neutral (30″), tenderness (46″) and anguish (46″).
The trial types were displayed via changes in the base color hues of the virtual environment and via specific music excerpts. Music excerpts were fixed for each trial type, and not influenced by current neural/psychological states (no music for Neutral, mild, gentle music for Tenderness and eerie, distorted music for Anguish). Music excerpts were selected from 20 audio tracks, all normalized using the root mean square feature of Audacity software (Audacity, http://www.audacityteam.org). The audio tracks were previously rated to have comparable volume, pace, and rhythm. For the rtfMRI-NFB task runs, four excerpts for tenderness and four excerpts for anguish were played.
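The RMS loudness matching that the authors performed in Audacity can be approximated programmatically. A minimal sketch, assuming the audio has already been decoded to a list of float samples (the target level here is an arbitrary illustrative value):

```python
import math

def rms(samples):
    """Root-mean-square level of an audio signal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def normalize_rms(samples, target_rms=0.1):
    """Scale a track so its RMS level matches target_rms,
    making excerpts comparably loud across trial types."""
    current = rms(samples)
    if current == 0:
        return list(samples)  # silent track: nothing to scale
    scale = target_rms / current
    return [s * scale for s in samples]

track = [0.5, -0.5, 0.25, -0.25]
normalized = normalize_rms(track, target_rms=0.1)
print(round(rms(normalized), 6))  # 0.1
```

Matching RMS rather than peak amplitude is what makes the excerpts perceptually comparable in loudness, which is the property the authors controlled for across trial types.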
Neutral trials were characterized by a normal colored virtual landscape displayed in the BCI with no background music. Tenderness trials were characterized by a change in the color of the virtual landscape to orange and were accompanied by tenderness music excerpts. Anguish trials commenced when the color of the environment turned to purple hues and were accompanied by anguish music excerpts.
Task Practice Outside the MRI
For training purposes, we recorded a video showing a sample of the virtual environment. The video lasted as long as one run of the rtfMRI-NFB task (10′ 8″) and was used by participants to practice tenderness, anguish and neutral states before the MRI. With this practice, participants could learn which music tracks and VR color changes in the BCI corresponded to tenderness, anguish and neutral trials.
As shown in Figure 1, instead of a classic thermometer display, the color of the virtual environment used as the BCI changed in real time with increasing engagement of the neural activity/pattern corresponding to the distinct target emotional states: orange for tenderness trials, purple for anguish trials, and natural light tones for neutral trials. Participants were instructed to experience tenderness or anguish as intensely as possible in the respective trials and to turn, in real time, the color of the virtual environment BCI as orange as possible during tenderness trials and as purple as possible during anguish trials by increasing the intensity of their emotion, which in turn increased the corresponding neural activity/pattern.
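The color feedback described above can be sketched as a simple mapping from a normalized neurofeedback score to a scene tint. The linear interpolation scheme, the RGB values and the 0–1 score range are illustrative assumptions, not the published implementation:

```python
def feedback_color(score, target_rgb, neutral_rgb=(1.0, 1.0, 1.0)):
    """Blend the scene tint toward the target hue as the neurofeedback
    score rises (0 = baseline activity, 1 = maximal engagement)."""
    s = min(max(score, 0.0), 1.0)  # clamp score to [0, 1]
    return tuple(n + s * (t - n) for n, t in zip(neutral_rgb, target_rgb))

ORANGE = (1.0, 0.5, 0.0)   # tenderness trials
PURPLE = (0.5, 0.0, 0.5)   # anguish trials

print(feedback_color(0.0, ORANGE))  # (1.0, 1.0, 1.0) -- neutral scene
print(feedback_color(1.0, PURPLE))  # (0.5, 0.0, 0.5) -- fully purple
```

In a Unity-based BCI such a value would typically be applied per frame to the scene's lighting or material tint, so the environment visibly shifts hue as the ROI signal or SVM classifier output increases.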
[Abstract] Robot-assisted arm training in physical and virtual environments: A case study of long-term chronic stroke
Development and testing of virtual environments for rehabilitation is a lengthy process which involves conceptualization, design, validation, proof-of-concept testing and ultimately, if appropriate, randomized controlled trials. Ironically, once vetted, many of these VEs are not available to clinicians or their patients. To address the challenge of transferring research-grade technology from the lab to the clinic, the authors have created the Open Rehabilitation Initiative, an international, independent online portal that aims to help clinicians, scientists, engineers, game developers and end-users interact with and share virtual rehabilitation tools. In this paper, the conceptualization, development and formative evaluation testing are described. Three groups participated in the study: developers of VEs (n=3), roboticists who use VEs for robot interactivity (n=10), and physical therapists (n=6), who are the clinician end-users. Interviews, focus groups and administration of the System Usability Scale (SUS) were used to assess acceptability. Data were collected on three aspects: 1) discussion of what a resource might look like; 2) interaction with the site; and 3) reaction to the proposed site and completion of the SUS. Interviews and focus groups were recorded and transcribed. Data from the SUS were analyzed using a one-way ANOVA. There was no significant difference between groups. However, the clinicians’ mean score of 68 on the SUS was just at the acceptable level, while the developers and roboticists scored above 80. While all users agreed that the site was a tool that could promote collaboration and interaction between developers and users, each group had different requirements for its design and use. Iterative development and discussion of scaling and sustaining the site are ongoing.
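For context on the SUS figures quoted above: the standard SUS scoring procedure takes ten 1–5 responses, scores odd-numbered (positively worded) items as response − 1 and even-numbered (negatively worded) items as 5 − response, and multiplies the sum by 2.5 to yield a 0–100 score, with roughly 68 conventionally treated as the acceptability threshold. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (positively worded) score response - 1;
    even-numbered items (negatively worded) score 5 - response.
    The raw sum (0-40) is scaled to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    raw = sum(r - 1 if i % 2 == 0 else 5 - r
              for i, r in enumerate(responses))
    return raw * 2.5

# All-neutral responses (3 on every item) give the scale midpoint:
print(sus_score([3] * 10))  # 50.0
```

So the clinicians' mean of 68 sits just at the conventional acceptability cutoff, while the developers' and roboticists' means above 80 fall well into the "good" range.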
Development and testing of virtual environments (VEs) for rehabilitation is a lengthy process which involves conceptualization, design, validation, proof-of-concept testing and ultimately, if appropriate, randomized controlled trials. Often, once the technology has been developed and tested for a specific application, it is discarded. This results in a lack of transfer of the technology to the clinician at the point of care. Several explanations exist for this lack of transfer. One is that the technology was not developed with the end-user in mind and therefore uses hardware that may not be readily available to people in clinical practice. Another is that the route to commercialization is expensive and lengthy, and many scientists are not interested in pursuing this avenue. In this paper we propose the development of an international community, the Open Rehab Initiative (ORI), as a solution whereby developers may share their technology with clinicians. As the virtual rehabilitation field evolves and technology becomes more accessible and available, the authors believe it is increasingly important to find mechanisms to coordinate and bring together clinicians, scientists and engineers to interact with and share their efforts with virtual rehabilitation tools. Recent reviews support the use of virtual rehabilitation training in people with neurological diagnoses (Pietrzak et al., 2014; Laver et al., 2015). However, a large part of the implementation of VR in rehabilitation is limited to work developed in the context of research projects, which does not reach end users, in particular clinicians and patients. Currently, what is available to clinicians are the results of efforts to repurpose commercial games available for game consoles, by providing clinicians with tools to adapt the Wii™ (Deutsch et al., 2011) and online resources on how to use the Adventure Games for the Kinect™ (Levac et al., 2015).
Resources are also made available by clinicians or researchers themselves through blogs or structured websites where hardware and software lists – mostly commercial Wii™ or Kinect™ games, and sometimes companies developing bespoke rehabilitation systems – are shared together with therapy game suggestions, ratings and tips (Leynse Harpold, 2016; Scott, 2016; TherapWii, 2016). In implementation science, researchers have studied knowledge transfer, as well as the support of clinical reasoning, through online resources (Deutsch et al., 2015). The use of these resources for knowledge translation has been associated with positive behavior change in healthcare workers including nurses, physicians, physical therapists, and occupational therapists (Magrabi et al., 2004; McKenna et al., 2005; Grimshaw et al., 2006; Honeybourne et al., 2006). Unfortunately, less work can be found on the systematic transfer of virtual environment and serious game technology from developers to users. Multiple efforts exist to create indices of games specially designed for specific health purposes (Lieberman et al., 2013; Serious Games Association, 2016). Unfortunately, available indices do not refer to the literature specific to the applications and, consequently, the use of such games and applications is mostly not validated. On the other hand, other initiatives such as Games for Health and Games for Health Europe (van Rijswijk et al., 2016) feature a limited number of research projects with detailed descriptions and information regarding their target population and scientific outcomes. However, in most cases the content is unavailable to the clinician and end user. Consequently, there is still a large body of validated, research-driven technology that remains unavailable to clinicians and patient populations.
There is therefore a need to facilitate the translation from research into daily clinical practice and to create new communication channels and a common framework to share and improve interventions in this area. The ORI is an international independent initiative that aims to help clinicians, scientists, engineers, game developers and end-users interact with and share virtual rehabilitation tools. The ORI portal is planned as a hub where the community who build and use software tools for virtual rehabilitation can easily communicate, interact with and share these tools. The webpage currently offers software, drivers, and documentation of evidence and application, with support for discussion boards and blogs. Although ORI originates from academic institutions, it is designed to grow through community-driven content, incorporating input from all the relevant communities. This sentiment is reflected in the ORI mission statement: ORI’s mission is to become “the go-to community for clinicians, scientists, engineers, game developers and end-users to interact with and share virtual rehabilitation tools”. As such, we aim to attract both developers and virtual rehabilitation users, for research as well as for clinical practice. The scope of the simulations encompasses sensorimotor and cognitive rehabilitation. The objective of this study was twofold: first, to describe the conceptualization and preliminary rendering of the ORI site; and second, to report on the formative evaluations conducted with three groups of users: clinicians in a rehabilitation setting, developers (clinician scientists and engineers) of VEs for rehabilitation, and an engineering group that develops robotic devices with serious game interfaces.
As developers and roboticists have a certain degree of technical expertise and would be both contributors to and users of the site, we anticipated that their assessments of the site’s capabilities, as well as their usability ratings, would differ from those of the clinicians.