By adding electronics and computation technology to a simple cane that has been around since ancient times, a team of researchers at Columbia Engineering has transformed it into a 21st-century robotic device that can provide light-touch assistance in walking to the aged and others with impaired mobility.
A team led by Sunil Agrawal, professor of mechanical engineering and of rehabilitation and regenerative medicine at Columbia Engineering, has demonstrated, for the first time, the benefit of using an autonomous robot that “walks” alongside a person to provide light-touch support, much as one might lightly touch a companion’s arm or sleeve to maintain balance while walking. Their study is published today in IEEE Robotics and Automation Letters.
“Often, elderly people benefit from light hand-holding for support,” explained Agrawal, who is also a member of Columbia University’s Data Science Institute. “We have developed a robotic cane attached to a mobile robot that automatically tracks a walking person and moves alongside,” he continued. “The subjects walk on a mat instrumented with sensors while the mat records step length and walking rhythm, essentially the space and time parameters of walking, so that we can analyze a person’s gait and the effects of light touch on it.”
The light-touch robotic cane, called CANINE, acts as a cane-like mobile assistant. The device improves the individual’s proprioception, or self-awareness in space, during walking, which in turn improves stability and balance.
“This is a novel approach to providing assistance and feedback for individuals as they navigate their environment,” said Joel Stein, Simon Baruch Professor of Physical Medicine and Rehabilitation and chair of the department of rehabilitation and regenerative medicine at Columbia University Irving Medical Center, who co-authored the study with Agrawal. “This strategy has potential applications for a variety of conditions, especially individuals with gait disorders.”
To test the new device, the team fitted 12 healthy young people with virtual reality glasses that created a visual environment that shakes around the user – both side-to-side and forward-backward – to unbalance their walking gait. Each subject walked 10 laps on the instrumented mat, both with and without the robotic cane, in conditions that tested walking with these visual perturbations. In every virtual environment, the light-touch support of the robotic cane led subjects to narrow their strides. The narrower strides, which represent a decrease in the base of support and a smaller oscillation of the center of mass, indicate an increase in gait stability due to the light-touch contact.
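The stability measures described here (base of support and its variability, as reflected in step width) can be illustrated with a minimal sketch. The data, function name, and coordinate convention below are hypothetical, not the study's actual analysis pipeline; they merely show how an instrumented mat's footfall positions might be summarized:

```python
# Hypothetical sketch: summarizing gait stability from instrumented-mat data.
# Step width (mediolateral offset between successive footfalls) is one proxy
# for base of support; a lower mean and lower variability suggest a more
# stable gait, as reported for the light-touch condition.

from statistics import mean, stdev

def gait_summary(footfalls):
    """footfalls: list of (x, y) heel positions in meters, alternating feet,
    where x is mediolateral and y is direction of travel.
    Returns (mean step width, step-width variability)."""
    widths = [abs(footfalls[i][0] - footfalls[i - 1][0])
              for i in range(1, len(footfalls))]
    return mean(widths), stdev(widths)

# Illustrative, made-up footfall data: without vs. with light-touch support.
no_cane   = [(0.12, 0.0), (-0.11, 0.6), (0.13, 1.2), (-0.10, 1.8), (0.12, 2.4)]
with_cane = [(0.08, 0.0), (-0.07, 0.6), (0.08, 1.2), (-0.07, 1.8), (0.08, 2.4)]

w0, s0 = gait_summary(no_cane)
w1, s1 = gait_summary(with_cane)
print(f"no cane:   mean width {w0:.3f} m, variability {s0:.3f} m")
print(f"with cane: mean width {w1:.3f} m, variability {s1:.3f} m")
```

With these made-up numbers, the light-touch condition yields a narrower mean step width, mirroring the direction of the study's finding.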
“The next phase in our research will be to test this device on elderly individuals and those with balance and gait deficits to study how the robotic cane can improve their gait,” said Agrawal, who directs the Robotics and Rehabilitation (ROAR) Laboratory. “In addition, we will conduct new experiments with healthy individuals, where we will perturb their head-neck motion in addition to their vision to simulate vestibular deficits in people.”
While mobility impairments affect 4% of people aged 18 to 49, this number rises to 35% of those aged 75 to 80 years, diminishing self-sufficiency, independence, and quality of life. By 2050, it is estimated that there will be only five young people for every old person, as compared with seven or eight today.
“We will need other avenues of support for an aging population,” Agrawal noted. “This is one technology that has the potential to fill the gap in care fairly inexpensively.”
In this short episode, Dr. Patrick discusses some of the compelling science, including observational studies, randomized controlled trials, and human mechanistic studies, suggesting that exercise is a powerful tool for preventing or managing the symptoms of depression and mental illness. She also discusses the specific types of exercise and exercise parameters that the evidence suggests may be most helpful for depression.
DISCLAIMER: This video is not meant to be a substitute for expert diagnosis or treatment of clinical conditions.
Visual Deficits: Now You See It, Now You Don’t – A Clinical Pearl by Diane Powers Dirette, PhD, OTL
Visual deficits accompany many diagnoses and, if undetected, can be mistaken for other problems, such as sensory, motor, balance, and cognitive deficits. It is therefore critical that therapists know how to complete a basic visual screening and interpret the results. For example, how can you tell homonymous hemianopia apart from unilateral inattention? The screening tools are virtually the same, but the screening results differ subtly.
Google Live Transcribe is an accessibility tool meant to make life easier for those who are deaf or hard of hearing. It automatically turns any speech into text while the person is still speaking. It’s fast enough to be used in conversations.
The text can be displayed as a black font on a white background or a white font on a black background. The top-right corner indicates whether the environment is noisy, which means people have to speak louder to be heard. And if someone speaks to you from behind, the phone vibrates to let you know. Try it out; it works surprisingly smoothly.
The app uses Google’s Cloud Speech API, so it requires an active internet connection. Google says it doesn’t store any audio on its servers, but we’d take such claims with a pinch of salt. Google already knows a lot about you, and it does share data with authorities.
Download: Google Live Transcribe for Android (Free)