Flint Rehab announces the launch of MiGo, a wearable activity tracker specifically designed for stroke survivors. The device makes its official debut at the 2019 Consumer Electronics Show in Las Vegas.
MiGo is designed to track upper extremity activity — in addition to walking — and is optimized for the movement patterns performed by individuals with stroke. The device is accompanied by a smartphone app that provides motivational support through digital coaching, progressive goal setting, and social networking with other stroke survivors, according to the company in a media release.
“Most wearable fitness trackers are designed to help people get into shape. MiGo is a new type of wearable that helps people regain their independence after a stroke,” says Dr Nizan Friedman, co-founder and CEO of Irvine, Calif.-based Flint Rehab, in the release.
“Traditionally, innovation in medical technology has been limited by what insurance companies are willing to cover. As a consumer-level digital health technology, MiGo avoids these constraints, empowering stroke survivors to take their recovery into their own hands.”
A common outcome of stroke is hemiparesis, or impaired movement on one side of the body. One of the leading causes of this lifelong disability is a phenomenon called “learned non-use,” where stroke survivors neglect to use their impaired arm or leg, causing their brain to lose the ability to control those limbs altogether.
MiGo directly addresses the problem of learned non-use by motivating stroke survivors to use their impaired side as much as possible. Using deep-learning algorithms, MiGo accurately tracks how much the wearer is using their impaired side, providing them with an easy-to-understand rep count throughout the day.
MiGo also provides an intelligent activity goal that updates every day based on the wearer’s actual movement ability, ensuring every user stays continuously challenged at the level appropriate for them. Then, the device acts as the wearer’s personal cheerleader, giving them rewards and positive feedback right on their wrist as they work to hit their daily goal, the release explains.
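The release does not describe how the daily goal is computed. As a rough illustration only (the function name and parameters below are hypothetical, not Flint Rehab's algorithm), an adaptive goal can blend the previous goal with recent rep counts and add a small stretch factor:

```python
# Hypothetical sketch of an adaptive daily rep goal; not Flint Rehab's algorithm.

def update_goal(previous_goal, recent_reps, alpha=0.3, challenge=1.1, floor=10):
    """Blend yesterday's goal with the average of recent rep counts,
    then scale up slightly so the wearer stays challenged."""
    if not recent_reps:
        return previous_goal          # no data yet: keep the old goal
    avg = sum(recent_reps) / len(recent_reps)
    blended = (1 - alpha) * previous_goal + alpha * avg
    return max(floor, round(blended * challenge))

# A user averaging ~200 reps/day against a 180-rep goal gets a modest bump
print(update_goal(180, [190, 210, 205]))  # 205
```

One design choice here: updating from a blend rather than from raw daily counts keeps the goal from swinging wildly after one unusually good or bad day.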
“Suffering a stroke is a traumatic, life-changing event. Many survivors do not have the proper support network to deal with the event, and they may find it difficult to relate with friends and family who don’t understand what they are going through,” states Dan Zondervan, co-founder and vice president of Flint Rehab.
“Using the MiGo app, users can join groups to share their activity data and collaborate with other stroke survivors to achieve group goals. Group members can also share their experiences and offer encouraging support to each other — right in the app,” he adds.
Brain-computer interface (BCI) technology combined with assistive robots has emerged as a promising method for stroke rehabilitation. However, most current studies rely on complex system setups and expensive, bulky devices. In this work, we designed a wearable electroencephalography (EEG)-based BCI system for hand function rehabilitation after stroke. The system consists of a customized EEG cap, a small commercial amplifier and a lightweight hand exoskeleton. In addition, a visualized interface was designed for ease of use. Six healthy subjects and two stroke patients were recruited to validate the safety and effectiveness of the proposed system. An average online BCI classification accuracy of up to 79.38% was achieved. This study is a proof of concept, suggesting potential clinical applications in outpatient environments.
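The abstract does not detail the classification pipeline. A common motor-imagery BCI approach, sketched below on illustrative synthetic data (this is an assumed, generic pipeline, not the one used in the paper), extracts log band-power features from band-pass-filtered EEG trials and applies a simple linear nearest-class-mean classifier:

```python
import numpy as np

# Assumed generic motor-imagery pipeline (not the paper's actual method):
# log band-power per channel + nearest-class-mean linear classifier.

def log_bandpower(trial):
    """trial: (channels, samples) array of band-pass filtered EEG."""
    return np.log(np.var(trial, axis=1) + 1e-12)

def train_nearest_mean(trials, labels):
    """Return the mean feature vector for each class."""
    feats = np.array([log_bandpower(t) for t in trials])
    return {c: feats[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(model, trial):
    """Assign the class whose mean feature vector is nearest."""
    f = log_bandpower(trial)
    return min(model, key=lambda c: np.linalg.norm(f - model[c]))

# Synthetic demo: class 0 trials are low-variance, class 1 high-variance
rng = np.random.default_rng(0)
trials = np.array([rng.normal(0, 1 + 4 * (i % 2), (4, 128)) for i in range(20)])
labels = np.array([i % 2 for i in range(20)])
model = train_nearest_mean(trials, labels)
print(classify(model, rng.normal(0, 5, (4, 128))))  # likely class 1
```

Real systems typically add spatial filtering (e.g., common spatial patterns) before the band-power step; the sketch keeps only the core feature-plus-linear-classifier idea.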
Many were shocked when “Game of Thrones” actress Emilia Clarke recently disclosed in The New Yorker that she had suffered two brain aneurysms, in 2011 and 2013. Thanks to her bravery and resilient spirit, she recovered quickly enough to resume her normal work life within weeks of each operation.
Clarke, however, belongs to the roughly 10 percent of survivors who recover completely after experiencing a stroke. By contrast, about 40 percent of stroke patients are left with moderate to severe impairments, according to statistics published by Healthline.
Strokes are common in the U.S., causing about 1 in 19 deaths. According to the American Stroke Association’s data, 795,000 Americans suffer a stroke every year, and 185,000 of those are recurrent attacks. Reintegrating into society with motor skills intact poses a serious challenge for survivors.
Strokes are a leading cause of disability in the U.S. One of the main disabilities stroke survivors deal with is apraxia, a neurological disorder that impairs the ability to perform purposeful movements. Other disabilities include hemiplegia, the one-sided paralysis of the body, and dysphagia, difficulty swallowing caused by damage to the part of the brain that controls swallowing.
Generally, rehabilitation and therapy help, but the field requires more research and innovation. A few scientists are developing novel therapy methods that harness the power of technology. For instance, the U.S. Food and Drug Administration (FDA) has approved the H200 Wireless Hand Rehabilitation System, which is available for purchase in commercial markets.
It is a wireless device that stimulates muscles in the forearm and hand. As of now, it is the only commercially available product of its kind for the hand muscles, according to a research paper reviewing devices for stroke rehabilitation. Biomove 3000, Hand Mentor PRO, mPower 1000 and NeuroMove are other devices developed in the past that are still available for purchase.
There is more hope and promise for survivors losing feeling in their arms, as extensive research is in progress. Another new device, meant to be worn like a glove, was developed as a prototype by researchers at Stanford University and Georgia Tech.
What is the latest prototype?
The innovation was the brainchild of a graduate student from Georgia Tech, Caitlyn Seim. She invented the glove to gently stimulate nerves for several hours a day to improve sensation in the arms and hands. The vibrating glove can be worn during normal day-to-day activities like shopping or listening to music.
Once the prototype was made, Seim showed it to her Stanford University professors to get help with further research and to eventually push the device to clinical testing. Maarten Lansberg, an associate professor of neurology, and Allison Okamura, a professor of mechanical engineering at Stanford University, are on board with the project. After the glove showed positive results in pilot studies, the trio received a grant from the prestigious Wu Tsai Neurosciences Institute to expand their research.
Currently, the researchers are in the process of improving the device for more comfort and accessibility before starting clinical tests again. The long-term vision is to build a device capable of helping stroke survivors restore lost function in their arms and hands.
The trio are united by a strong curiosity and passion to help stroke survivors. Lansberg had been treating stroke patients as a medical doctor, and Okamura has done research on touch-based devices with the intention to help such patients.
Seim’s interest stems from building wearable computing devices like virtual goggles and smartwatches, but she now intends to use her expertise to benefit health care and accessibility. She will be joining Stanford as a postdoctoral fellow in the fall, and she’ll continue working on the glove.
The most obvious sign someone has survived a stroke is usually some trouble speaking or walking. But another challenge may have an even greater impact on someone’s daily life: Often, stroke survivors lose sensation and muscle control in one arm and hand, making it difficult to dress and feed themselves or handle everyday objects such as a toothbrush or door handle.
Now, doctors and engineers at Stanford and Georgia Tech are working on a novel therapy that could help more stroke survivors regain the ability to control their arms and hands – a vibrating glove that gently stimulates the wearer’s hand for several hours a day.
Caitlyn Seim, a graduate student at Georgia Tech, started the project in the hope that the glove’s stimulation could have some of the same impact as more traditional exercise programs. After developing a prototype, she approached Stanford colleagues Maarten Lansberg, an associate professor of neurology, and Allison Okamura, a professor of mechanical engineering, in order to expand her efforts. With help from a Wu Tsai Neurosciences Institute Neuroscience seed grant, the trio are working to improve on their prototype glove and bring the device closer to clinical testing.
“The concept behind it is that users wear the glove for a few hours each day during normal daily life – going to the supermarket or reading a book at home,” said Seim. “We are hoping that we can discover something that really helps stroke survivors.”
Reaching for new stroke treatments
Seim, Lansberg and Okamura’s goal is a tall order. Despite some individual success stories, the reality is that most stroke patients struggle to regain the ability to speak, move around and take good care of themselves.
“Stroke can affect patients in many ways, including causing problems with arm function, gait, vision, speech and cognition,” Lansberg said, yet despite decades of research, “there are essentially no treatments that have been proven to help stroke patients recover these functions.”
It was in that context that all three researchers independently started thinking about what they could do to improve the lives of people who’ve survived strokes. As the medical doctor in the bunch, Lansberg had already been treating stroke patients for years and has helped lead the Stanford Stroke Collaborative Action Network, or SCAN, another project of the Wu Tsai Neurosciences Institute. Okamura, meanwhile, has focused much of her research on haptic or touch-based devices, and in the last few years her lab has spent more and more time thinking about how to use those devices to help stroke survivors.
“Rehabilitation engineering provides a unique opportunity for me to work directly with the patients who are affected by our research,” Okamura said. “The potential to translate the kind of technology relatively quickly to a commercial product that can reach a large number of stroke patients in need of therapy is also very exciting.”
For her part, Seim’s interest in stroke stems from an interest in wearable computing devices – but rather than build more virtual reality goggles and smartwatches, Seim said she wants to apply wearable computing to the areas of health and accessibility, “areas which have some of the most compelling problems to me,” she said.
Growing a new idea
With that ambition in mind, Seim built a prototype vibrating glove that she hoped would stimulate nerves and improve both sensation and function in stroke survivors’ hands and arms. After collecting some promising initial data, Seim reached out to the Stanford team.
“Stanford has SCAN and StrokeNet, along with a community of interdisciplinary engineering and computing research, so I reached out to Maarten, and he was very supportive,” Seim said.
Now, Seim, Lansberg and Okamura are revising the glove’s design to improve its function and to add elements for comfort and accessibility. Then, they’ll begin a new round of clinical tests at Stanford.
Long term, the hope is to build something that helps stroke survivors recover some of the functions they have lost in their hands and arms. And if initial tests work out, Lansberg said, it’s possible the same basic idea could be applied to treat other complications associated with stroke.
“The glove is an innovative idea that has shown some promise in pilot studies,” Lansberg said. “If proven beneficial for patients with impaired arm function, it is conceivable that variations of this type of therapy could be developed to treat, for example, patients with impaired gait.”
This paper presents the design and development of a highly articulated, continuum, wearable, fabric-based Soft Poly-Limb (fSPL). This fabric soft arm acts as an additional limb that provides the wearer with mobile manipulation assistance through the use of soft actuators made with high-strength inflatable fabrics. In this work, a set of systematic design rules is presented for the creation of highly compliant soft robotic limbs through an understanding of the fabric-based components’ behavior as a function of input pressure. These design rules are generated by investigating a range of parameters through computational finite-element method (FEM) models focusing on the fSPL’s articulation capabilities and payload capacity in 3D space. The theoretical motion and payload outputs of the fSPL and its components are experimentally validated, and additional evaluations verify its capability to safely carry loads 10.1x its body weight by wrapping around the object. Finally, we demonstrate how the fully collapsible fSPL can comfortably be stored in a soft waist belt and interact with the wearer through spatial mobility and preliminary pick-and-place control experiments.
Recovering after a stroke isn’t easy, but Neofect is here to help patients track their rehabilitation progress with an innovative wearable solution.
At CES 2019, the company exhibited its Rapael Smart Glove, a high-tech rehab device that helps stroke patients improve their hand movements. The device also syncs with an app, where patients can play rehabilitation games and track milestones.
Neofect didn’t disclose a price for the Rapael Smart Glove, but customers can go on the company’s website to buy it. The Rapael Smart Glove is also available for clinics that need stroke rehabilitation equipment.
Using the Rapael Smart Glove is very easy: Gently slide on the device, connect to the Rapael App with a smartphone or tablet, and play a variety of rehabilitation games. The app’s fun games include virtual tennis matches and house painting, and they’re available in different levels to balance challenge and motivation. Plus, the Rapael App collects practice data for patients, so they can track their hand recovery progress.
With the Rapael Smart Glove, patients can practice hand exercises and improve dexterity over time. An advantage of the Rapael Smart Glove is that it can help stroke patients who might not have immediate access to hospitals or physical therapy facilities, so they can work on their hand movements without leaving home.
“We aim to help patients all around the world including, but not limited to, those unable to receive appropriate treatment due to economic or geographic reasons,” says Neofect’s website. “By providing rehab training products and services that are available anytime and anywhere, we are committed to improving patient’s rehab experiences and quality of life.”
New technologies mean we won’t just see and hear digital information. We’ll also feel it.
By Matthew Hutson 12.20.2018
In Steven Spielberg’s 2018 film Ready Player One, based on the 2011 book by Ernest Cline, people enter an immersive world of virtual reality called the OASIS. What was most gripping about the futuristic tech in this sci-fi movie was not the VR goggles, which don’t seem so far off from the headsets currently sold by Oculus, HTC and others. It was the engagement of a sense beyond sight and sound: touch.
Characters wore gloves with feedback that let them feel the imaginary objects in their hands. They could upgrade to full body suits that reproduced the force of a punch to the chest or the stroking of a caress. And yet these capabilities, too, might not be as far off as we imagine.
We rely on touch — or “haptic” — information continuously, in ways we don’t even consciously recognize. Nerves in our skin, joints, muscles and organs tell us how our bodies are positioned, how tightly we’re holding something, what the weather is like, or that a loved one is showing affection through a hug. Around the world, engineers are now working to recreate realistic touch sensations, for video games and more. Engaging touch in human-computer interactions would enhance robotic control, physical rehabilitation, education, navigation, communication and even online shopping.
“In the past, haptics has been good at making things noticeable, with vibration in your phone or the rumble packs in gaming controllers,” says Heather Culbertson, a computer scientist at the University of Southern California. “But now there’s been a shift toward making things that feel more natural, that more mimic the feel of natural materials and natural interactions.”
Take surgical robots, which allow doctors to operate from the other side of the world, or to manipulate tools too small or in spaces too tight for their hands. Numerous studies have shown that adding haptic feedback to the control of these robots increases accuracy and reduces tissue damage and operation time. Robots with haptic feedback also allow doctors to train on patients that exist only in virtual reality while getting the feeling of actual cutting and suturing. One of Culbertson’s students is currently developing dental simulators so that a dental student’s first mistaken drilling is not on a real tooth.
Getting a feel for what the robot under your command is doing would also be helpful for defusing bombs or extracting people from collapsed buildings. Or for repairing a satellite without suiting up for a spacewalk. Even Disney has looked into haptic telepresence robots, for safe human-robot interactions. They developed a system that has pneumatic tubes connecting a humanoid’s robotic arms with a mirror set of arms for a human to grasp. The person can manipulate the mirror bot to cause the first bot to hold a balloon, pick up an egg or pat a child on the cheeks.
On a smaller scale, the lab of roboticist Jamie Paik at the Swiss Federal Institute of Technology in Lausanne (EPFL) has developed a portable haptic interface called Foldaway. Devices about the size and shape of a square drink coaster have three hinged arms that pop up, meeting in the middle. (Stefano Mintchev, a postdoc in the lab, calls them “miniaturized origami robots.”) A small plastic handle can be stuck on top where the arms meet, creating a joystick that acts in three dimensions — and the arms push back, to give the user a sense of the objects they’re pushing against. In demos, the team has used the devices to control an aerial drone, squeeze virtual objects and feel the shape of virtual human anatomy.
There are certain challenges in grasping haptics that might seem insurmountable — for instance, how do you provide a sense of weight when grabbing and lifting weightless digital objects? But by studying neuroscience, engineers have managed to find a few workarounds. Culbertson and colleagues developed a device called Grabity for the gravity problem. It’s a kind of vise that one grips and squeezes to pick up virtual objects. Simply by vibrating in certain ways, it can produce the illusion of weight and inertia.
But “fooling the brain only goes so far,” says Ed Colgate, a mechanical engineer at Northwestern University who works in haptics. It’s sometimes easy to break haptic illusions. To his mind, in the long run engineers will need to recreate the physics of the real world — weight and all — as faithfully as possible. “That’s a really hard problem.”
Graspable devices often take advantage of kinesthetic sensations: feelings of movement, position and force mediated by nerves not just in our skin but also in our muscles, tendons and joints. Wearable devices, on the other hand, usually rely on tactile sensations — pressure, friction or temperature — mediated by nerves in the skin.
A variety of experimental devices are worn on the finger, pressing against the finger pad with different degrees of force as one touches objects in virtual reality. But a recent device provides the same sort of feedback without covering the finger pad. Instead, it’s worn where one might wear a ring and contains motors that stretch the skin underneath. That keeps the fingers free to interact with real-world objects while still sensing “virtual” ones — a useful feature for both games and serious applications.
In one test, a person could hold a real piece of chalk and feel pressure as they “wrote” on a virtual chalkboard by virtue of a haptic illusion: As they simultaneously saw the chalk contact the board and felt their skin stretched, they were fooled into feeling pressure in their fingertips.
More commonly, wearable haptic devices communicate through vibration. Culbertson’s lab, for example, is working on a wristband that guides the wearer by vibrating in the direction he or she needs to turn. And NeoSensory, a company founded by Stanford neuroscientist David Eagleman, is developing a vest with 32 vibratory motors that was showcased in an episode of HBO’s sci-fi series Westworld where it ostensibly helped characters identify the direction of approaching enemies.
One of the vest’s first real applications will be to translate sound into tactile sensation to make spoken language more intelligible to people with profound or complete hearing loss. Eagleman is also working on translating aspects of the visual world into vibrations for people who are blind. Other efforts involve more abstract information such as market and environmental data — instead of a grid indicating where things are spatially, a complex pattern of vibrations might indicate the prices of a dozen stocks.
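NeoSensory has not published its encoding, so the details below are our assumption, not the vest's actual algorithm. The general idea of translating sound into a spatial vibration pattern can be sketched: split each audio frame's spectrum into 32 contiguous bands, one per motor, and drive each motor in proportion to the energy in its band:

```python
import numpy as np

# Assumed sketch of a sound-to-touch encoding; NeoSensory's actual algorithm
# is not public. Each audio frame's spectrum is split into 32 contiguous
# bands, and each vest motor is driven in proportion to its band's energy.

N_MOTORS = 32

def frame_to_motor_intensities(frame):
    """frame: 1-D array of audio samples. Returns 32 intensities in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    bands = np.array_split(spectrum, N_MOTORS)      # one band per motor
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy    # normalize to [0, 1]

# A pure 900 Hz tone should light up mainly the band containing 900 Hz
t = np.arange(512) / 16000                          # 512 samples at 16 kHz
intensities = frame_to_motor_intensities(np.sin(2 * np.pi * 900 * t))
print(intensities.argmax())
```

In practice such systems also compress the dynamic range and smooth intensities across frames so the pattern feels continuous rather than flickering.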
Vibrating motors can be bulky, so some labs are developing more comfortable solutions. Paik’s lab at EPFL is working on a soft pneumatic actuator (SPA) skin — a sheet of flexible silicone less than 2 millimeters thick that’s dotted with tiny air pockets. They can be inflated and deflated independently dozens of times per second and thereby act as pixels — or “taxels,” for tactile elements — creating a grid of sensation. They might provide feelings of the kind the suits offer in Ready Player One, or feedback on the positioning of robots or prosthetic limbs. SPA skin is also embedded with sensors made of a new, corrosion-resistant metal alloy that allows the same skin to be used for computer input when the user squeezes it.
An even thinner haptic film — less than half a millimeter thick — is also in the offing, created by Novasentis and made of a new form of polyvinylidene fluoride plastic that balances strength, flexibility and electrical responsiveness. When the film is layered on one side of a sheet of flexible material and an electrical charge is applied, the film contracts and flexes the sheet, applying pressure against the skin. Novasentis is now providing the material to device manufacturers who are putting it into gloves for virtual reality and gaming.
“You can distinguish between water and sand and rock,” says Sri Peruvemba, the company’s vice president of marketing. VR designers could also create more abstract representations, such as sensation-delivered messages about the state of a game. “We can create a whole haptic language with our technology,” Peruvemba says.
Vibrations can produce another kind of haptic illusion: the sensation of pulling. If a device that vibrates back and forth parallel to the skin’s surface moves quickly in one direction and slowly back the other way, many times a second, it feels as if it is tugging the skin in the first direction.
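That asymmetric profile is easy to picture as a sawtooth-like drive signal: a fast stroke one way, a slow return the other. A minimal sketch (all parameters are illustrative, not taken from any published device):

```python
import numpy as np

# Illustrative asymmetric drive signal for the "pulling" illusion:
# a fast ramp in the pull direction followed by a slow return.

def asymmetric_pull_waveform(cycles=5, fast_fraction=0.2, samples_per_cycle=100):
    """Return a normalized actuator position signal in [-1, 1]."""
    fast = int(samples_per_cycle * fast_fraction)
    slow = samples_per_cycle - fast
    cycle = np.concatenate([
        np.linspace(-1.0, 1.0, fast),   # quick stroke in the pull direction
        np.linspace(1.0, -1.0, slow),   # slow return the other way
    ])
    return np.tile(cycle, cycles)

wave = asymmetric_pull_waveform()
print(wave.shape)  # (500,)
```

Because the skin responds more strongly to the fast stroke than the slow return, the net percept is a tug in the fast direction even though the actuator ends each cycle where it started.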
Video: When users don FlyJacket, a wearable haptic device, they can control the flight path of aerial drones with their arms and torso and feel the pushback of gusts of wind. (Credit: EPFL LIS via YouTube)
While most wearables use tactile sensation, they can also use the muscle-joint-tendon input of kinesthetic sensation. Engineers have developed robotic exoskeletons, a kind of scaffolding strapped to the body with sensors and motors, that can help paralyzed people walk, give soldiers super strength, and let people control robots at a distance. A lab at EPFL has developed the FlyJacket, which one wears with arms straight out to the sides, connected by pistons to the waist. It doesn’t look especially fly, but it allows people to control the flight of aerial drones by moving their arms and twisting their torsos. When the drone feels a gust of wind, you do too.
The final category of devices are touchable interfaces, such as smartphone screens that give a little bump when you click on an app. Culbertson’s work pushes beyond simple bumps and buzzes. She simulates texture on surfaces using what she calls “data-driven haptics.” Instead of writing complicated algorithms or physics models to generate vibrations that simulate real ones, she records what happens when something is dragged over different fabrics or other materials at different speeds and pressures. Then she has a surface play back the vibrations when a pen is dragged across it. Applications could include online shopping and virtual museums.
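In a minimal form, data-driven playback is a lookup: record vibration snippets under several speed/pressure conditions, then select the recording whose conditions are closest to the pen's current state. The sketch below is our simplification with made-up data; Culbertson's actual systems interpolate between recordings rather than picking a single nearest one:

```python
import numpy as np

# Simplified nearest-neighbour version of data-driven texture playback.
# Keys and snippet values are made-up illustrative data.

recordings = {
    # (drag speed mm/s, normal force N) -> recorded vibration snippet
    (50, 0.5): np.array([0.1, -0.1, 0.2, -0.2]),
    (50, 1.0): np.array([0.3, -0.3, 0.4, -0.4]),
    (200, 0.5): np.array([0.5, -0.5, 0.6, -0.6]),
}

def pick_recording(speed, pressure):
    """Return the snippet recorded under the closest conditions.

    Pressure is scaled by 100 so a 1 N difference weighs like 100 mm/s;
    the weighting is an arbitrary illustrative choice."""
    key = min(recordings,
              key=lambda k: (k[0] - speed) ** 2 + (100 * (k[1] - pressure)) ** 2)
    return recordings[key]

print(pick_recording(60, 0.4))
```

A playback loop would then stream the selected (or interpolated) snippet to a voice-coil actuator under the surface, re-evaluating the lookup as the pen's speed and pressure change.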
Touchable surfaces also allow types of illusions. For instance, Culbertson says, playing the sound of a button clicking as one taps a picture of a button makes it feel as if the button is actually clicking. Or making the screen appear to deform under one’s finger can make it feel softer. People construct perception by tying together sight, sound, touch, taste and smell — and, as Culbertson says, “It’s really easy to fool your brain if you have a mismatch between your senses.”
Realistic haptics for VR may forever be clunky and expensive. Or technology may eventually make Ready Player One look quaint. In either case, as we can see with baby steps such as the pervasive rumbling of video game controllers and endlessly vibrating phones and watches, haptic devices are here to stay, adding a new dimension to our digital lives.
Matthew Hutson is a freelance science writer in New York City who writes for Science, the Atlantic, Scientific American and other publications. He’s the author of The 7 Laws of Magical Thinking. He tweets at @SilverJacket.
The potential for wearable mechatronic systems to assist with musculoskeletal rehabilitation of the upper limb has grown with the technology. One limiting factor to realizing the benefits of these devices as motion therapy tools is within the development of digital control solutions. Despite many device prototypes and research efforts in the surrounding fields, there are a lack of requirements, details, assessments, and comparisons of control system characteristics, components, and architectures in the literature. Pairing this with the complexity of humans, the devices, and their interactions makes it a difficult task for control system developers to determine the best solution for their desired applications.
The objective of this thesis is to develop, evaluate, and compare control system solutions that are capable of tracking motion through the control of wearable mechatronic devices. Due to the immaturity of these devices, the design, implementation, and testing processes for the control systems are not well established. In order to improve the efficiency and effectiveness of these processes, control system development and evaluation tools have been proposed.
The Wearable Mechatronics-Enabled Control Software framework was developed to enable the implementation and comparison of different control software solutions presented in the literature. This framework reduces the amount of restructuring and modification required to complete these development tasks. An integration testing protocol was developed to isolate different aspects of the control systems during testing. A metric suite is proposed that expands on the existing literature and allows for the measurement of more control characteristics. Together, these tools were used to develop, evaluate, and compare control system solutions.
Using the developed control systems, a series of experiments were performed that involved tracking elbow motion using wearable mechatronic elbow devices. The accuracy and repeatability of the motion tracking performances, the adaptability of the control models, and the resource utilization of the digital systems were measured during these experiments. Statistical analysis was performed on these metrics to compare between experimental factors. The results of the tracking performances show some of the highest accuracies for elbow motion tracking with these devices. The statistical analysis revealed many factors that significantly impact the tracking performance, such as visual feedback, motion training, constrained motion, motion models, motion inputs, actuation components, and control outputs.
Furthermore, the completion of the experiments resulted in three first-time studies, including the comparison of muscle activation models and the quantification of control system task timing and data storage needs. The successes of these experiments highlight that accurate motion tracking, using biological signals of the user, is possible, but that many more efforts are needed to obtain control solutions that are robust to variations in the motion and characteristics of the user.
To guide the future development of these control systems, a national survey of therapists was conducted regarding their patient data collection and analysis methods. From the results of this survey, a series of requirements was collected for software systems that allow therapists to interact with the control systems of these devices. Increasing the participation of therapists in the development process of wearable assistive devices will help to produce better requirements for developers.
This will allow the customization of control systems for specific therapies and patient characteristics, which will increase the benefit and adoption rate of these devices within musculoskeletal rehabilitation programs.
tenoexo is a compact and lightweight hand exoskeleton developed in collaboration with Jumpei Arata at Kyushu University. The EMG-controlled device assists patients with moderate to severe hand motor impairment during grasping tasks in rehabilitation training and in activities of daily living. Its soft mechanism allows for grasping a variety of objects, and thanks to 3D rapid prototyping it can be tailored to each individual user.
Stroke, spinal cord injury, and muscular atrophy are just a few examples of diseases leading to persistent hand impairment. Whatever the cause, the inability to use the affected hand in activities of daily living reduces independence and quality of life. Wearable robotic devices can support the use of the impaired limb in activities of daily living and provide at-home rehabilitation training. In collaboration with the groups of Prof. Jumpei Arata at Kyushu University, Japan, and Gregory Fischer at Worcester Polytechnic Institute, USA, we have developed a highly compact and lightweight hand exoskeleton.
Our exoskeleton aims to assist patients in grasping tasks during physiotherapy and in activities of daily living such as eating or grooming. Various grasp types, intuitive control based on electromyography (Ryser et al., 2017) and numerous usability features should increase the independence of the user. The current prototype, RELab tenoexo, is fully wearable and consists of a lightweight hand module (148 g) as well as an actuation box including motors, power source and controllers (720 g), all located in a compact backpack. tenoexo’s remote actuation system (Hofmann et al., 2018) and its compliant 3-layered sliding spring mechanism (Arata et al., 2013) ensure safe operation and inherent adaptation to the shape of the grasped objects. The palmar side of the hand is minimally covered to allow for natural somatosensory feedback during object manipulation. The actuated thumb module allows for both opposition and lateral grasps. tenoexo is fabricated to a large extent by 3D-printing technology. With an underlying automatic tailoring algorithm it can be adapted to the individual user within a few minutes. The maximal fingertip force of 4.5 N per finger allows for grasping and lifting of most everyday objects, up to 0.5-liter water bottles.
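The EMG control scheme is detailed in Ryser et al. (2017) rather than here; purely as an illustrative sketch of the general idea, a common pattern is to smooth the rectified EMG signal into an envelope and trigger grasp assistance when the envelope crosses a threshold, with hysteresis to avoid chattering. All thresholds and the smoothing constant below are assumed values, not parameters of tenoexo:

```python
import numpy as np

def emg_envelope(emg, alpha=0.05):
    """Rectify a raw EMG signal and smooth it with a simple
    exponential moving average to obtain an activation envelope."""
    env = np.zeros(len(emg), dtype=float)
    level = 0.0
    for i, sample in enumerate(np.abs(np.asarray(emg, dtype=float))):
        level = (1.0 - alpha) * level + alpha * sample
        env[i] = level
    return env

def grasp_commands(envelope, on_thresh=0.5, off_thresh=0.3):
    """Hysteresis trigger: command hand closure while the envelope is
    above on_thresh, and reopen only once it falls below off_thresh."""
    closed = False
    commands = []
    for level in envelope:
        if not closed and level > on_thresh:
            closed = True
        elif closed and level < off_thresh:
            closed = False
        commands.append(closed)
    return commands
```

The gap between the two thresholds is what prevents the hand from rapidly opening and closing when the muscle activation hovers near a single trigger level.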
Our current focus is on the evaluation of tenoexo with several individuals suffering from stroke or spinal cord injury, and on exploring its potential as both an assistive and a therapeutic device in these populations. In related projects, we are investigating intention detection through functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) to allow for cortically triggered assistance. Our vision is to realize a thought-controlled robotic hand exoskeleton for upper limb therapy and assistance in the clinic and at home.
Pictures (source: ETH Zurich / Stefan Schneller)
Swiss National Science Foundation through the National Center of Competence in Research (NCCR) Robotics
Strategic Japanese-Swiss Cooperative Research Program on “Medicine for an Aging Society”
Nycz, Ch., Bützer, T., Lambercy, O., Arata, J., Fischer, G.S., and Gassert, R. (2016). Design and Characterization of a Lightweight and Fully Portable Remote Actuation System for Use with a Hand Exoskeleton. IEEE Robotics and Automation Letters, 1(2):976–983.
Arata, J., Ohmoto, K., Gassert, R., Lambercy, O., Fujimoto, H., and Wada, I. (2013). A new hand exoskeleton device for rehabilitation using a three-layered sliding spring mechanism. IEEE International Conference on Robotics and Automation, pp. 3902–3907.
Our hands are essential in daily life. They are used for non-verbal communication and sensory feedback, and for performing both fine (e.g. picking up paperclips) and gross (e.g. lifting heavy boxes) motor tasks. Decline of hand function in older adults, as a result of age-related loss of muscle mass (i.e. sarcopenia) and/or age-related diseases such as stroke, rheumatoid arthritis, or osteoarthritis, is a common problem worldwide. This decline, in particular in grip strength, often results in increased difficulty performing activities of daily living (ADL), such as carrying heavy objects, doing housework, (un)dressing, preparing food, and eating.
New developments, based on the concept of wearable soft-robotic devices, make it possible to support impaired hand function during the performance of daily activities and intensive task-specific training. The ironHand and HandinMind systems are examples of such novel wearable soft-robotic systems, developed in the ironHand and HandinMind projects. Both systems were developed to provide grip support during a wide range of daily activities. The ironHand system consists of a 3-finger wearable soft-robotic glove, tailored to older adults with a variety of physical age-related hand function limitations. The HandinMind system consists of a 5-finger wearable soft-robotic glove, dedicated to application in stroke. In both cases, the wearable soft-robotic system can be connected to a computer with custom software to train specific aspects of hand function in a motivating game-like environment with multiple levels of difficulty. By adding the game environment, an assistive device is transformed into a dedicated training device.
The aim of the current thesis is to define user requirements, to investigate feasibility and to evaluate the direct and clinical effects of a wearable soft-robotic system that is developed to support impaired hand function of older adults and stroke patients in a wide range of daily activities and in exercise training at home.