Posts Tagged rehabilitation devices

[Abstract] A Preliminary Study: Mobile Device for Hand and Wrist Rehabilitation – IEEE Conference Publication

Abstract:

Task-specific rehabilitation has emerged as an influential approach to addressing specific neurological impairments. In particular, recovery of hand and wrist function in people with hemiparesis or hemiplegia depends on intensive voluntary practice. In this study, a passive rehabilitation device was designed to offer a repetitive, low-cost, portable, easy-to-use human-machine interface for people with limited hand-wrist mobility, to substantially decrease the therapist's workload, and to provide motivation and objective feedback to users. Therapy-based, task-oriented virtual reality games accompany the proposed rehabilitation device to sustain the patient's attention and motivation throughout the therapy.

via A Preliminary Study: Mobile Device for Hand and Wrist Rehabilitation – IEEE Conference Publication


[ARTICLE] SITAR: a system for independent task-oriented assessment and rehabilitation

Over recent years, task-oriented training has emerged as a dominant approach in neurorehabilitation. This article presents a novel, sensor-based system for independent task-oriented assessment and rehabilitation (SITAR) of the upper limb.

The SITAR is an ecosystem of interactive devices including a touch and force–sensitive tabletop and a set of intelligent objects enabling functional interaction. In contrast to most existing sensor-based systems, SITAR provides natural training of visuomotor coordination through collocated visual and haptic workspaces alongside multimodal feedback, facilitating learning and its transfer to real tasks. We illustrate the possibilities offered by the SITAR for sensorimotor assessment and therapy through pilot assessment and usability studies.

The pilot data from the assessment study demonstrates how the system can be used to assess different aspects of upper limb reaching, pick-and-place and sensory tactile resolution tasks. The pilot usability study indicates that patients are able to train arm-reaching movements independently using the SITAR with minimal involvement of the therapist and that they were motivated to pursue the SITAR-based therapy.

SITAR is a versatile, non-robotic tool that can be used to implement a range of therapeutic exercises and assessments for different types of patients, which is particularly well-suited for task-oriented training.

The increasing demand for intense, task-specific neurorehabilitation following neurological conditions such as stroke and spinal cord injury has stimulated extensive research into rehabilitation technology over the last two decades.1,2 In particular, robotic devices have been developed to deliver a high dose of engaging repetitive therapy in a controlled manner, decrease the therapist's workload and facilitate learning. Current evidence from clinical interventions using these rehabilitation robots generally shows results comparable to intensity-matched, conventional, one-to-one training with a therapist.3–5 Assuming the correct movements are being trained, the primary factor driving this recovery appears to be the intensity of voluntary practice during robotic therapy rather than any other factor such as the physical assistance provided.6,7 Moreover, most existing robotic devices for training the upper limb (UL) tend to be bulky and expensive, raising further questions about the use of complex, motorised systems for neurorehabilitation.

Recently, simpler, non-actuated devices, equipped with sensors to measure patients' movement or interaction, have been designed to provide performance feedback, motivation and coaching during training.8–12 Research in haptics13,14 and human motor control15,16 has shown how visual, auditory and haptic feedback can be used to induce learning of a skill in a virtual or real dynamic environment. For example, simple force sensors (or even electromyography) can be used to infer motion control17 and provide feedback on the required and actual performances, which can allow subjects to learn a desired task. Therefore, an appropriate therapy regime using passive devices that provide essential and engaging feedback can enhance learning of improved arm and hand use.
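The feedback idea above can be sketched in a few lines: compare a required force profile with the measured one and turn the tracking error into a simple score the patient can see. This is an illustrative sketch only, not code from any of the cited systems; the function name, scoring rule and values are hypothetical.

```python
def performance_feedback(required, measured):
    """Return RMS tracking error and a 0-100 score (higher is better)."""
    assert len(required) == len(measured)
    n = len(required)
    # Root-mean-square error between required and measured force samples.
    rms = (sum((r - m) ** 2 for r, m in zip(required, measured)) / n) ** 0.5
    # Normalise by the peak required force so the score is task-independent.
    peak = max(abs(r) for r in required) or 1.0
    score = max(0.0, 100.0 * (1.0 - rms / peak))
    return rms, score

# Example: perfect tracking gives zero error and a full score.
rms, score = performance_feedback([1.0, 1.0, 1.0], [1.0, 1.0, 1.0])
```

A score like this can drive the visual or auditory feedback channel of a passive sensor-based trainer, rewarding accurate force production trial by trial.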

Such passive sensor-based systems can be used for both impairment-based training (e.g. gripAble18) and task-oriented training (ToT) (e.g. AutoCITE8,9, ReJoyce11). ToT views the patient as an active problem-solver, focusing rehabilitation on the acquisition of skills for the performance of meaningful and relevant tasks rather than on isolated remediation of impairments.19,20 ToT has proven beneficial for participants and is currently considered a dominant and effective approach to training.20,21

Sensor-based systems are ideal for delivering task-oriented therapy in an automated and engaging fashion. For instance, the AutoCITE system is a workstation containing various instrumented devices for training some of the tasks used in constraint-induced movement therapy.8 The ReJoyce uses a passive manipulandum with a composite instrumented object having various functionally shaped components to allow sensing and training of gross and fine hand functions.11 Timmermans et al.22 reported how stroke survivors can carry out ToT by using objects on a tabletop with inertial measurement units (IMU) to record their movement. However, this system does not include force sensors, critical in assessing motor function.

In all these systems, subjects perform tasks such as reach or object manipulation at the tabletop level, while receiving visual feedback from a monitor placed in front of them. This dislocation of the visual and haptic workspaces may affect the transfer of skills learned in this virtual environment to real-world tasks. Furthermore, there is little work on using these systems for the quantitative task-oriented assessment of functional tasks. One exception to this is the ReJoyce arm and hand function test (RAHFT)23 to quantitatively assess arm and hand function. However, the RAHFT primarily focuses on range-of-movement in different arm and hand functions and does not assess the movement quality, which is essential for skilled action.24–28

To address these limitations, this article introduces a novel, sensor-based System for Independent Task-Oriented Assessment and Rehabilitation (SITAR). The SITAR consists of an ecosystem of different modular devices capable of interacting with each other to provide an engaging interface with appropriate real-world context for both training and assessment of UL. The current realisation of the SITAR is an interactive tabletop with visual display as well as touch and force sensing capabilities and a set of intelligent objects. This system provides direct interaction with collocation of visual and haptic workspaces and a rich multisensory feedback through a mixed reality environment for neurorehabilitation.

The primary aim of this study is to present the SITAR concept, the current realisation of the system, together with preliminary data demonstrating the SITAR’s capabilities for UL assessment and training. The following section introduces the SITAR concept, providing the motivation and rationale for its design and specifications. Subsequently, we describe the current realisation of the SITAR, its different components and their capabilities. Finally, preliminary data from two pilot clinical studies are presented, which demonstrate the SITAR’s functionalities for ToT and assessment of the UL. […]

Continue —> SITAR: a system for independent task-oriented assessment and rehabilitation Journal of Rehabilitation and Assistive Technologies Engineering – Asif Hussain, Sivakumar Balasubramanian, Nick Roach, Julius Klein, Nathanael Jarrassé, Michael Mace, Ann David, Sarah Guy, Etienne Burdet, 2017

Figure 1. The SITAR concept with (a) the interactive table-top alongside some examples of intelligent objects developed including (b) iJar to train bimanual control, (c) iPen for drawing, and (d) iBox for manipulation and pick-and-place.


[ARTICLE] Multimodal robotic system for upper-limb rehabilitation in physical environment – Full Text HTML

Abstract

This article researches the feasibility of use of a multimodal robotic system for upper-limb neurorehabilitation therapies in physical environments, interacting with real objects. This system consists of an end-effector upper-limb rehabilitation robot, a hand exoskeleton, a gaze tracking system, an object tracking system, and electromyographic measuring units. For this purpose, the system architecture is stated, explaining the detailed functions of each subsystem as well as the interaction among them. Finally, an experimental scenario is designed to test the system with healthy subjects in order to check whether the system is suitable for future experiments with patients.

Introduction

The use of robotic systems in neurorehabilitation therapies may be justified by their potential impact on better treatment and motor learning.1 For this reason, in recent years, a wide variety of robotic devices for upper-limb neurorehabilitation have been developed by research groups around the world.2–11

In conjunction with these robotic devices, a wide range of robot-oriented rehabilitation interfaces and environments have been proposed. Many of the current devices use virtual reality systems to set up the rehabilitation context,12–17 and just a few examples use physical environments.18,19 It should be pointed out that all these examples, except Badesa et al.'s14 work, use robotic exoskeletons.

Virtual reality systems are especially suitable for early stages of the disease,20 due to the flexibility they offer when designing tasks and feedback stimuli, and the safety they provide through the absence of interaction with physical objects that could lead to injuries. However, in order to obtain a realistic interaction, it is necessary to use haptic devices,21–24 which results in expensive and complex systems. In contrast, physical environments may be a good, inexpensive alternative for performing more complex, functional rehabilitation tasks in later stages of the disease, when patients have recovered some motor control of their upper limb.

The objective of this article is to check whether an end-effector rehabilitation robot25 can be used to develop a fully functional multimodal rehabilitation system in physical environments. In contrast to Frisoli et al.'s19 work, the use of an end-effector robot instead of an exoskeleton is expected to result in a considerable reduction in setup time as well as an increase in the user's comfort. Additionally, the brain–computer interface (BCI) is replaced by electromyography, which does not require previous training, reducing the user's mental fatigue26 and saving additional time.

In this regard, the experimentation will focus on testing whether the mechanical system can be controlled with enough precision and safety to interact with objects and perform a simple occupational therapy activity successfully, so that further research along this path can be conducted.

Multimodal architecture

The starting point is an already designed upper-limb neurorehabilitation robot, which was conceived to deliver therapies in virtual reality environments, during the early stages after stroke.

In order to achieve the stated objectives, a multimodal architecture has been defined so that users can exploit a combination of their residual capabilities to perform the task. Among the capabilities that patients may retain, eye movement and electromyographic (EMG) signals have been chosen for these tests.

Specifically, the designed system is composed of the following:

  • An object tracking system, which gives the position of the object that will be handled.

  • A gaze tracking device that will determine which object the patient is looking at.

  • EMG measuring units used as a trigger of several actions.

  • An end-effector rehabilitation robot that will assist the patient to perform reaching movements.

  • A hand exoskeleton for grasping the objects.

  • A computer that implements the high-level control (HLC) system, which will process and coordinate the signals of each device and will determine the control actions.

The communication and relationships between these elements are shown in Figure 1.


Figure 1. System architecture and communications between components.
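The coordination role of the HLC described above can be sketched as a small state machine: gaze selects the target object, an EMG burst triggers the next action, and the robot and hand exoskeleton are commanded accordingly. This is an illustrative sketch, not the authors' implementation; the state names, threshold and command strings are all hypothetical.

```python
EMG_TRIGGER_LEVEL = 0.6  # hypothetical normalised EMG activation threshold

def hlc_step(state, gazed_object, emg_level):
    """Advance the high-level control (HLC) state machine one tick.

    state: one of "idle", "reaching", "grasping"
    gazed_object: object id from the gaze tracker, or None
    emg_level: normalised EMG activation in [0, 1]
    Returns (next_state, command or None).
    """
    if state == "idle":
        # The patient fixates an object and produces an EMG burst:
        # command the end-effector robot to assist the reach.
        if gazed_object is not None and emg_level >= EMG_TRIGGER_LEVEL:
            return "reaching", f"robot: assist reach to {gazed_object}"
        return "idle", None
    if state == "reaching":
        # A second EMG burst triggers the hand exoskeleton to close.
        if emg_level >= EMG_TRIGGER_LEVEL:
            return "grasping", "exoskeleton: close hand"
        return "reaching", None
    # grasping: relax back to idle once EMG activity drops.
    if emg_level < EMG_TRIGGER_LEVEL:
        return "idle", "exoskeleton: open hand"
    return "grasping", None

# Example: gaze on a cup plus an EMG burst starts an assisted reach.
state, command = hlc_step("idle", "cup", 0.8)
```

Separating sensing (gaze, EMG, object tracking) from actuation (robot, exoskeleton) in this way keeps each subsystem replaceable behind the HLC, which matches the modular architecture the article describes.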

Continue —> Multimodal robotic system for upper-limb rehabilitation in physical environment

Figure 2. Difference between hand position and end-effector position with respect to the reference frame of the end-effector.

[BOOK] New Trends in Medical and Service Robots: Assistive, Surgical and Educational Robotics – Google Books

Medical and Service Robotics integrate the most recent achievements in mechanics, mechatronics, computer science, haptic and teleoperation devices, together with adaptive control algorithms.

The book includes topics such as surgery robotics, assist devices, rehabilitation technology, surgical instrumentation and Brain-Machine Interface (BMI) as examples for medical robotics.  Autonomous cleaning, tending, logistics, surveying and rescue robots, and elderly and healthcare robots are typical examples of topics from service robotics.

This book is the proceedings of the Third International Workshop on Medical and Service Robots, held in Lausanne, Switzerland, in 2014. It presents an overview of current research directions and fields of interest. It is divided into three sections, namely

  1. assistive and rehabilitation devices;
  2. surgical robotics; and
  3. educational and service robotics.

Most contributions are strongly anchored in collaborations between technical and medical actors: engineers, surgeons and clinicians. Biomedical robotics and the rapidly growing field of service automation have clearly overtaken the "classical" industrial robotics and automatic-control-centred activity familiar to the older generation of roboticists.

Source: New Trends in Medical and Service Robots: Assistive, Surgical and … – Google Books


[THESIS] Design of Customized Rehabilitation Devices and Bench Testing System – Full Text PDF

ABSTRACT

Off-the-shelf rehabilitation devices are currently prescribed to assist patients with stroke. Current fabrication processes for custom-made rehabilitation devices are time consuming and laborious, and can only be performed by skilled therapists. In addition, quantitative assessment of mechanical properties is crucial in the design of customized rehabilitation devices. Through the design and real-time implementation of a 3D-printed hand exoskeleton and a biomimetic testbed for Ankle-Foot Orthoses (AFOs), this study presents improved, digitalized methodologies for the design and bench testing of customized rehabilitation devices.

A customized 3D-printed hand exoskeleton (the EXCELSIOR) was developed and prototyped to assist stroke patients with finger extension exercises. 3D printing was combined with 3D scanning to create a custom-fit clamp. Compliant finger elements were designed and optimized using Finite Element Analysis. Embedded strain gauges were used to measure the angular positions of the finger joints. In addition, a novel biomimetic testbed was designed to perform stiffness measurement and functional analysis for AFOs.

A biomimetic footplate was designed to adjust the pivot centers of the metatarsophalangeal (MTP) joint and the ankle joint according to patient-specific anatomy. Feedback control systems were developed and implemented in real time to perform AFO stiffness measurements. An impedance control system was developed and implemented in real time to simulate the kinematics of the human ankle for further functional analysis of gait. Real-time implementation of the hand exoskeleton and the AFO testbed proved the design and testing concepts for customized rehabilitation devices.
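Mapping an embedded strain-gauge reading to a joint angle typically relies on a calibration fit. The thesis does not give its calibration procedure, so the following is only a minimal sketch of the common linear approach, with hypothetical function names and calibration values: record strain at known joint angles, fit angle = gain × strain + offset by least squares, then invert new readings.

```python
def fit_linear_calibration(strains, angles):
    """Least-squares fit of angle = gain * strain + offset."""
    n = len(strains)
    mean_s = sum(strains) / n
    mean_a = sum(angles) / n
    # Covariance of strain with angle, and variance of strain.
    cov = sum((s - mean_s) * (a - mean_a) for s, a in zip(strains, angles))
    var = sum((s - mean_s) ** 2 for s in strains)
    gain = cov / var
    offset = mean_a - gain * mean_s
    return gain, offset

def strain_to_angle(strain, gain, offset):
    """Convert a raw strain-gauge reading to a joint angle in degrees."""
    return gain * strain + offset

# Example: calibrate against readings taken at 0 and 90 degrees of flexion.
gain, offset = fit_linear_calibration([0.0, 1.0], [0.0, 90.0])
angle = strain_to_angle(0.5, gain, offset)
```

In practice a compliant finger element may respond nonlinearly over its full range, in which case a polynomial or piecewise fit over more calibration points would replace the linear model.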

Full Text PDF

