
[BLOG POST] Brain-Computer Interface & Virtual Avatar Offers New Hope to Patients with Gait Disabilities – Neuroscience News

Summary: Coupling a noninvasive brain-computer interface with a virtual walking avatar may help those with gait disorders regain control of their movements, a new study reports. Source: University of Houston.

Researchers from the University of Houston have shown for the first time that the use of a brain-computer interface augmented with a virtual walking avatar can control gait, suggesting the protocol may help patients recover the ability to walk after stroke, some spinal cord injuries and certain other gait disabilities.

Researchers said the work, done at the University’s Noninvasive Brain-Machine Interface System Laboratory, is the first to demonstrate that a brain-computer interface can promote and enhance cortical involvement during walking. The study, funded by the National Institute of Neurological Disorders and Stroke, was published this week in Scientific Reports.

Image: NeuroscienceNews.com, adapted from the University of Houston video.

Jose Luis Contreras-Vidal, Cullen professor of electrical and computer engineering at UH and senior author of the paper, said the data will be made available to other researchers. While similar work has been done in other primates, this is the first to involve humans, he said. Contreras-Vidal is also site director of the BRAIN Center (Building Reliable Advances and Innovation in Neurotechnology), a National Science Foundation Industry/University Cooperative Research Center.

Contreras-Vidal and researchers with his lab use non-invasive brain monitoring to determine what parts of the brain are involved in an activity, using that information to create an algorithm, or a brain-machine interface, which can translate the subject’s intentions into action.

In addition to Contreras-Vidal, researchers on the project are first author Trieu Phat Luu, a research fellow in neural engineering at UH, and Sho Nakagome and Yongtian He, graduate students in the UH Department of Electrical and Computer Engineering.

“Voluntary control of movements is crucial for motor learning and physical rehabilitation,” they wrote. “Our results suggest the possible benefits of using a closed-loop EEG-based BCI-VR (brain-computer interface-virtual reality) system in inducing voluntary control of human gait.”

Researchers already knew electroencephalogram (EEG) readings of brain activity can distinguish whether a subject is standing still or walking. But they hadn’t previously known if a brain-computer interface was practical for helping to promote the ability to walk, or what parts of the brain are relevant to determining gait.

In this case, they collected data from eight healthy subjects, all of whom participated in three trials involving walking on a treadmill while watching an avatar displayed on a monitor. The volunteers were fitted with a 64-channel EEG headset and motion sensors at the hip, knee and ankle joints.

The avatar was first driven by the motion sensors, so that its movement precisely mimicked that of the test subject. In later tests, the avatar was controlled by the brain-computer interface, meaning the subject controlled the avatar with his or her brain.
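The contrast between the two modes can be sketched in code. The study does not publish its decoder, so the following is a minimal, hypothetical illustration: the sensor-driven mode passes measured joint angles straight to the avatar, while the BCI-driven mode predicts joint angles from EEG features with a least-squares linear map. All names, dimensions and data here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sensor_driven(joint_angles):
    """Motion-capture mode: the avatar mirrors the measured joint angles directly."""
    return joint_angles

def train_linear_decoder(eeg_features, joint_angles):
    """Fit a least-squares linear map from EEG features to joint angles."""
    W, *_ = np.linalg.lstsq(eeg_features, joint_angles, rcond=None)
    return W

def bci_driven(eeg_features, W):
    """BCI mode: joint angles are predicted from brain activity instead of sensors."""
    return eeg_features @ W

# Simulated training data: 500 time samples, 64 EEG channels, 6 joint angles
# (hip, knee and ankle on each leg).
X = rng.standard_normal((500, 64))
true_W = rng.standard_normal((64, 6))
Y = X @ true_W + 0.1 * rng.standard_normal((500, 6))

W = train_linear_decoder(X, Y)
decoded = bci_driven(X, W)
# The decoded trajectory tracks the sensor-measured one but not perfectly,
# mirroring the less precise match reported for the BCI condition.
```

In a real closed-loop system the decoder would run on streaming EEG and the subject would adapt to its errors over time, which is the learning effect Contreras-Vidal describes below.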

The avatar perfectly mimicked the subject’s movements when relying upon the sensors, but the match was less precise when the brain-computer interface was used.

Contreras-Vidal said that’s to be expected, noting that other studies have shown some initial decoding errors as the subject learns to use the interface. “It’s like learning to use a new tool or sport,” he said. “You have to understand how the tool works. The brain needs time to learn that.”

The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex, which is involved in motor learning and error monitoring.

The next step is to use the protocol with patients, the subject of He’s Ph.D. dissertation.

“The appeal of brain-machine interface is that it places the user at the center of the therapy,” Contreras-Vidal said. “They have to be engaged, because they are in control.”

Source: Brain-Computer Interface & Virtual Avatar Offers New Hope to Patients with Gait Disabilities – Neuroscience News


[WEB SITE] Gaming helps personalized therapy level up – Penn State University

UNIVERSITY PARK, Pa. — Using game features in non-game contexts, computers can learn to build personalized mental- and physical-therapy programs that enhance individual motivation, according to Penn State engineers.

“We want to understand the human and team behaviors that motivate learning to ultimately develop personalized methods of learning instead of the one-size-fits-all approach that is often taken,” said Conrad Tucker, assistant professor of engineering design and industrial engineering.

The researchers seek to use machine learning to train computers to develop personalized mental or physical therapy regimens — for example, to overcome anxiety or recover from a shoulder injury — so many individuals can each use a tailor-made program.

“Using people to individually evaluate others is not efficient or sustainable in time or human resources and does not scale up well to large numbers of people,” said Tucker. “We need to train computers to read individual people. Gamification explores the idea that different people are motivated by different things.”

To begin creating computer models for therapy programs, the researchers tested how to most effectively make the completion of a physical task into a gamified application by incorporating game features like scoring, avatars, challenges and competition.

“We’re exploring here how gamification could be applied to health and wellness by focusing on physically interactive gamified applications,” said Christian Lopez, graduate student in industrial engineering, who helped conduct the tests using a virtual-reality game environment.

Screen from game designed to test features for gamification use in physical and mental therapy. Image: Kimberly Cartier / Penn State

In the virtual-reality tests, researchers asked participants to physically avoid obstacles as they moved through a virtual environment. The game system recorded their actual body positions using motion sensors and then mirrored their movements with an avatar in virtual reality.

Participants had to bend, crouch, raise their arms, and jump to avoid obstacles. The participant successfully avoided a virtual obstacle if no part of their avatar touched the obstacle. If they made contact, the researchers rated the severity of the mistake by how much of the avatar touched the obstacle.
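The severity rating described above can be sketched as a simple overlap test. This is an illustrative reconstruction, not the study's code: the avatar is sampled at a few body landmarks, and severity is the fraction of landmarks that fall inside an axis-aligned obstacle volume.

```python
import numpy as np

def collision_severity(avatar_points, box_min, box_max):
    """Fraction of avatar body points inside an axis-aligned obstacle box.

    0.0 means a clean avoidance; larger values mean a worse mistake.
    """
    pts = np.asarray(avatar_points)
    inside = np.all((pts >= box_min) & (pts <= box_max), axis=1)
    return inside.mean()

# A crouching avatar sampled at a few body landmarks (x, y, z):
avatar = [(0.0, 0.4, 0.0),   # head (lowered by the crouch)
          (0.0, 0.3, 0.0),   # torso
          (0.1, 0.1, 0.0),   # knee
          (0.1, 0.0, 0.0)]   # foot

# An overhead bar the participant must duck under:
severity = collision_severity(avatar, box_min=(-1.0, 0.9, -1.0),
                              box_max=(1.0, 1.2, 1.0))
print(severity)  # → 0.0, a clean avoidance
```

A standing avatar whose head remains at the bar's height would score 0.25 here (one of four landmarks in contact), giving the graded mistake severity the researchers describe.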

In one of the application designs, participants could earn more points by moving to collect virtual coins, which sometimes caused them to hit an obstacle.

“As task complexity increases, participants need more motivation to achieve the same level of results,” said Lopez. “No matter how engaging a particular feature is, it needs to move the participant towards completing the objective rather than backtracking or wasting time on a tangential task. Adding more features doesn’t necessarily enhance performance.”

Tucker and Lopez created a predictive algorithm — a mathematical formula to forecast the outcome of an event — that rates the potential usefulness of a game feature. They then tested how well each game feature motivated participants when completing the virtual-reality tasks. They compared their test results to the algorithm’s predictions as a proof of concept and found that the formula correctly anticipated which game features best motivated people in the physically interactive tasks.

The researchers found that gamified applications with a scoring system, the ability to select an avatar, and in-game rewards led to significantly fewer mistakes and higher performance than those with a win-or-lose system, randomized gaming backgrounds and performance-based awards.

Sixty-eight participants tested two designs that differed only by the features used to complete the same set of tasks. Tucker and Lopez published their results in Computers in Human Behavior.

The researchers chose the tested game features from the top-ranked games in the Google Play app store, taking advantage of the features that make the games binge-worthy and re-playable, and then narrowed the selection based on available technology.

Their algorithm next ranked game features by how easily designers could implement them, the physical complexity of using the feature, and the impact of the feature on participant motivation and ability to complete the task. If a game feature is too technologically difficult to incorporate into the game, too physically complex, does not offer enough incentive for added effort or works against the end goal of the game, then the feature has low potential usefulness.
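The ranking logic can be illustrated as a weighted score over the three criteria the article names. The actual algorithm and its weights are not published, so everything below — the scoring formula, the weights, and the per-feature values — is an invented sketch of the idea, not the researchers' model.

```python
def usefulness(impl_difficulty, physical_complexity, motivation_impact):
    """Score a game feature: motivation counts for it, implementation
    difficulty and physical complexity of use count against it.
    Higher is better. Weights here are arbitrary for illustration."""
    return motivation_impact - 0.5 * impl_difficulty - 0.5 * physical_complexity

# Hypothetical 1-10 ratings for features mentioned in the article:
features = {
    "scoring system":   usefulness(impl_difficulty=1, physical_complexity=1,
                                   motivation_impact=8),
    "avatar selection": usefulness(2, 1, 7),
    "virtual coins":    usefulness(2, 4, 5),   # tempts tangential movement
    "win-or-lose only": usefulness(1, 1, 2),
}

ranked = sorted(features, key=features.get, reverse=True)
print(ranked)
# A feature that is hard to build, physically demanding, or works against
# the task objective sinks to the bottom of the ranking.
```

Under these invented ratings the scoring system ranks first and the bare win-or-lose design last, consistent with the comparison the researchers report.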

The researchers would also like to use these results to boost workplace performance and personalize virtual-reality classrooms for online education.

“Game culture has already explored and mastered the psychological aspects of games that make them engaging and motivating,” said Tucker. “We want to leverage that knowledge towards the goal of individualized optimization of workplace performance.”

To do this, Tucker and Lopez next want to connect performance with mental state during these gamified physical tasks. Heart rate, electroencephalogram signals and facial expressions will be used as proxies for mood and mental state while completing tasks to connect mood with game features that affect motivation.

The National Science Foundation funded this research.

Source: Gaming helps personalized therapy level up | Penn State University

