Posts Tagged VICON

[ARTICLE] Validation of a Kinect V2 based rehabilitation game – Full Text

Abstract

Interactive technologies are beneficial to stroke recovery as rehabilitation interventions; however, they lack evidence for use as assessment tools. Mystic Isle is a multi-planar, full-body rehabilitation game developed using the Microsoft Kinect® V2. It aims to help stroke patients improve their motor function and daily activity performance and to assess the motions of the players. It is important that the assessment results generated from Mystic Isle are accurate. The Kinect V2 has been validated for tracking lower limbs and calculating gait-specific parameters. However, few studies have validated the accuracy of the Kinect® V2 skeleton model in upper-body movements. In this paper, we evaluated the spatial accuracy and measurement validity of a Kinect-based game, Mystic Isle, in comparison to a gold-standard optical motion capture system, the Vicon system. Thirty participants completed six trials in sitting and standing positions. Game data from the Kinect sensor and the Vicon system were recorded simultaneously, then filtered and sample-rate synchronized. The spatial accuracy was evaluated using Pearson's r correlation coefficient, signal-to-noise ratio (SNR) and 3D distance difference. Each arm-joint signal had an average correlation coefficient above 0.9 and an SNR above 5. The hip joint data had less stability and a large variation in SNR. Also, the mean 3D distance difference of the joints was less than 10 centimeters. For measurement validity, the accuracy was evaluated using the mean and standard error of the difference, percentage error, Pearson's r correlation coefficient and intra-class correlation (ICC). Average errors of maximum hand extent of reach were less than 5%, and the average errors of mean and maximum velocities were about 10% and less than 5%, respectively. We have demonstrated that Mystic Isle provides accurate measurement and assessment of movement relative to the Vicon system.
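The abstract's spatial-accuracy metrics can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the paper's analysis code; in particular, the SNR definition used here (Vicon signal treated as ground truth, Kinect-minus-Vicon residual treated as noise) is an assumption.

```python
import math

def pearson_r(a, b):
    # Pearson correlation between two equal-length joint-coordinate signals
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def snr_db(reference, measured):
    # SNR in dB, assuming the reference (e.g., Vicon) is ground truth and
    # the measured-minus-reference residual is the noise
    n = len(reference)
    signal_power = sum(x ** 2 for x in reference) / n
    noise_power = sum((m - r) ** 2 for m, r in zip(measured, reference)) / n
    return 10 * math.log10(signal_power / noise_power)

def mean_3d_distance(traj_a, traj_b):
    # Mean Euclidean distance between paired 3-D joint positions
    dists = [math.dist(p, q) for p, q in zip(traj_a, traj_b)]
    return sum(dists) / len(dists)
```

Both signals must already be filtered and resampled to a common rate, as the paper describes, before these per-joint comparisons are meaningful.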

Introduction

In the past decade and quite rapidly in the past five years, Natural User Interfaces (NUIs) and video games have grown in popularity in both consumer applications and in healthcare []. Specifically, physical rehabilitation (e.g., physical and occupational therapy) has embraced novel NUI applications in clinics, hospitals, nursing homes, and the community []. Robotic systems have long included game-based and NUI-based user interfaces and most robotic devices provide some form of physical assistance to the patient and/or haptic feedback []. With the release of the Nintendo Wii in 2006, many NUI applications for healthcare moved away from bulky, expensive robotics and embraced the portable nature of movement and gesture recognition devices and systems. One of the biggest breakthroughs for this field came in 2010 when Microsoft released the Kinect sensor to accompany its Xbox console system. Within days and weeks of the Kinect’s release, hackers, universities, and companies began to exploit its markerless movement sensing abilities for educational and healthcare use. Since then, there has been an exponential increase in the number of studies that report the use of the Kinect as the input device for a NUI-based rehabilitation game or feedback application [].

In 2014, Jintronix was the first company to receive FDA approval for its rehabilitation game system that uses the Microsoft Kinect. There are a number of similar companies that utilize the Kinect sensor including SeeMee [], VirtualRehab [], Reflexion Health [], MIRA [], MotionCare360 [], and 5Plus Therapy []. Many of these systems are marketed for delivering rehabilitation therapy in the home setting. This type of delivery is termed “tele-rehabilitation” and can involve remote monitoring by the therapist or virtual sessions over teleconferencing software []. For telerehabilitation or remote sessions, it is imperative that the data the therapist receives from the system or movement-sensing device (such as the Microsoft Kinect) are accurate and reliable. If the therapist plans to use the data for documentation or for reimbursement from a health insurance company, the data ought to be as accurate as current clinical tools (e.g., goniometers).

Only one of the listed companies has validated the measurement capabilities of their systems and of the Microsoft Kinect. Kurillo and colleagues evaluated their system used in 5Plus Therapy against the Impulse motion-capture system (PhaseSpace Inc., San Leandro, CA) and found that it had good accuracy of joint positions and small to large percentage errors in joint angle measurements []. However, this study had a small sample size of only 10 subjects and used the first version of the Kinect sensor in its validation. Additionally, the movements used in the assessment were only within a single plane for each movement and all participants were seated during data collection.

Other researchers have validated the Kinect’s measurement and tracking capabilities for both general and specific applications. Hondori and Khademi [] provide an excellent summary of the work completed prior to 2014. It should be noted that all of these studies evaluated the first version of the Kinect. Following the release of the Kinect V2 sensor, most researchers have focused their validation efforts on gait and posture applications []. The Kinect V2 has good-to-excellent tracking and measurement capabilities for gait-specific parameters and clinical outcomes. However, many of these studies tracked only the lower limbs. Furthermore, gait is a rhythmic motion that is relatively consistent across participants, even in rehabilitation populations (i.e., one foot in front of the other). Full-body movements in which participants are not limited to specific planes and may choose to use either hand have not been studied in current and prior comparisons of the Microsoft Kinect and optical marker-based motion capture systems.

We have developed software called Mystic Isle that utilizes the Microsoft Kinect V2 sensor as the input device []. Mystic Isle is designed as a rehabilitation game and has shown good results in improving motor function and daily activity performance in persons with chronic stroke []. The software initially used the first version (V1) of the Microsoft Kinect as the input device and we completed a study that compared it to the OptiTrack optical system []. Based on a visual analysis, we demonstrated that for the hand and elbow, the Kinect V1 has good accuracy in calculating trajectory of movement. For the shoulder, the Kinect V1 tracking abilities limit its validity. Although these findings are promising, the types and number of movements used in the study were limited to those in a seated position and mostly in one plane of movement (e.g., sagittal). Furthermore, the tracking capabilities of the Kinect V2 have substantially improved in the past 7 years and include more data points (joints) for comparison.

The current Mystic Isle game involves multi-planar, full body movements. Designed for individuals with diverse abilities, games can be played in a sitting or standing position, depending on the therapy treatment plan. In standing, the player is able to move around in the 3-dimensional space, akin to real-world rehabilitation. Few studies have evaluated the tracking and measurement capabilities of the Microsoft Kinect V2 for full-body, multi-planar movements in both sitting and standing. The purpose of this study was to determine the spatial accuracy and measurement validity of the Microsoft Kinect V2 sensor in a NUI rehabilitation game in comparison to a gold-standard marker-based motion capture system (Vicon).

Materials and methods

Participants

Participants were recruited via convenience sample on the University of Missouri-Columbia campus. Participants were included if they: 1) were over the age of 18, 2) could understand conversational English, and 3) had no medical conditions that prevented them from playing video games. The study was approved by the Health Sciences Institutional Review Board at the University of Missouri (approval number IRB 2005896 HS). All potential participants were screened, and all subjects provided written informed consent before beginning the study.

Mystic Isle

Mystic Isle is a platform for rehabilitation that allows a user to interact with a virtual environment by using their body (Fig 1). The Mystic Isle software was created in Unity 3D and allows the tracked user to interact with virtual environments and objects in a 3-D world. Using Mystic Isle, specific movements, distances, and locations of objects can be tailored to the abilities and requirements of the user. The system uses the Microsoft Kinect V2 camera to track participant movements. The Kinect V2 tracks 25 discrete points/joints on the body of the user. Both gross motor (stepping, jumping, squatting) and fine motor (waving the hand, turning the palm up, opening/closing the hand) movements can be tracked. The Kinect V2 tracks the user in 3-dimensional space and then inputs the data in real time to the associated software, Mystic Isle. The Kinect V2 tracks and records the x, y, and z coordinates (and confidence) of each discrete joint at either 15 or 30 frames per second.
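From per-frame joint coordinates like these, summary measures such as maximum extent of reach and hand velocity can be derived by finite differences. The sketch below is illustrative only; the function names and the fixed 30 fps sampling assumption are ours, not the Mystic Isle implementation.

```python
import math

FRAME_RATE = 30  # Kinect V2 nominal frame rate (frames per second)

def max_extent_of_reach(hand_positions, shoulder_positions):
    # Largest hand-to-shoulder distance observed over a trial (metres)
    return max(math.dist(h, s)
               for h, s in zip(hand_positions, shoulder_positions))

def hand_velocities(hand_positions, fps=FRAME_RATE):
    # Finite-difference speed between consecutive frames (m/s)
    return [math.dist(p1, p0) * fps
            for p0, p1 in zip(hand_positions, hand_positions[1:])]

# Example: a hand moving 0.01 m per frame along x, shoulder at the origin
hand = [(0.01 * i, 0.0, 0.0) for i in range(10)]
shoulder = [(0.0, 0.0, 0.0)] * 10
speeds = hand_velocities(hand)
```

Mean and maximum velocity for the trial are then just `sum(speeds) / len(speeds)` and `max(speeds)`, the quantities compared against Vicon in the abstract.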


Fig 1
Mystic Isle game environment. (a) A virtual avatar collecting targets in a Kinect-based rehabilitation game, Mystic Isle. (b) A participant playing the game with Vicon markers on the body. Joint data of game trials were recorded by a Kinect and the Vicon system for validation.

[…]

Continue —>  Validation of a Kinect V2 based rehabilitation game


[Abstract] Accurate upper body rehabilitation system using kinect

Abstract:

The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence, we propose an optimization method that utilizes Kinect depth and RGB information to search for the joint center location that satisfies constraints on both body segment length and orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results show a 72% reduction in body segment length variance and a 2° improvement in range of motion (ROM) angle, enabling more accurate measurements for upper limb exercises.
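The core idea of the segment-length constraint can be illustrated with a much-simplified correction step: snap a noisy child-joint estimate onto a sphere of the calibrated bone length centred on its parent joint. This sketch omits the paper's depth/RGB search and its orientation constraint; it only demonstrates the length constraint itself, with hypothetical values.

```python
import math

def enforce_segment_length(parent, child_est, bone_length):
    # Project the noisy child-joint estimate onto the sphere of radius
    # bone_length around the parent joint, preserving its direction.
    d = math.dist(parent, child_est)
    if d == 0:
        raise ValueError("parent and child coincide; direction undefined")
    scale = bone_length / d
    return tuple(p + (c - p) * scale for p, c in zip(parent, child_est))

# Example: elbow estimated 0.35 m from the shoulder, but the calibrated
# upper-arm length is 0.30 m, so the estimate is pulled back onto the sphere
corrected = enforce_segment_length((0.0, 0.0, 0.0), (0.35, 0.0, 0.0), 0.30)
```

Applying such a correction frame by frame removes the segment-length jitter described above, at the cost of trusting the direction of the raw estimate; the paper's full method instead searches the depth/RGB data for a better joint location.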

I. Introduction

Body joint movement analysis is essential for health monitoring and for the treatment of patients with neurological disorders and stroke. Chronic hemiparesis of the upper extremity following a stroke causes major hand movement limitations. There is a possibility of a permanent reduction in muscle coactivation and corresponding joint torque patterns due to stroke [1]. Several studies suggest that abnormal coupling of shoulder adductors with elbow extensors and shoulder abductors with elbow flexors often leads to the stereotypical movement characteristics exhibited by severe stroke patients [2]. Therefore, continuous and effective rehabilitation therapy is essential to monitor and control such abnormalities. There is a substantial need for home-based rehabilitation following clinical therapy.

Source: Accurate upper body rehabilitation system using kinect – IEEE Xplore Document


[ARTICLE] Quantitative Analysis of the Human Upper-Limb Kinematic Model for Robot-based Rehabilitation Applications – Full Text PDF

Abstract—Upper-limb robotic rehabilitation systems should inform therapists about their patients’ status. Such therapy systems must be developed carefully, taking into consideration the real-life uncertainties associated with sensor error. In our paper, we describe a system composed of a depth camera that tracks the motion of the patient’s upper limb, and a robotic manipulator that challenges the patient with repetitive exercises. The goal of this study is to propose a motion analysis system that improves the readings of the depth camera through the use of a kinematic model that describes the motion of the human arm. In our current experimental set-up we use the Kinect v2 to capture a participant who performs rehabilitation exercises with the Barrett WAM robotic manipulator. Finally, we provide a numerical comparison among the stand-alone measurements from the Kinect v2, the estimated motion parameters of our system and the VICON, which we consider an error-free ground truth apparatus.

I. INTRODUCTION

It is generally accepted that modern physical rehabilitation plays an essential role in the enhancement or restoration of motor skills affected by inherent or incidental disorders. Such disorders may result from a variety of causes such as amputation, spinal cord injury, musculoskeletal impairment and even brain injury. In light of this, robotic rehabilitation augments classical rehabilitation techniques, in that adaptable robotic devices, such as mechanical manipulators, can be used to complement the training routines of a physiatrist or occupational therapist. In this paper we describe and evaluate a novel system that can be used by physicians and therapists to monitor the state of the upper limbs of a patient who performs exercises. The system emphasizes the use of the Microsoft Kinect v2 as opposed to wearable sensors, such as embedded accelerometers, gyroscopes and EMGs. In the following sections we present, analyze and evaluate the proposed system. Specifically, in section 2 we discuss how related studies tackle the problem of pose estimation with vision-based or wearable sensors. Furthermore, we discuss how our system exploits kinematic formulas that originate from the area of robotic mechanics and describe the motion of rigid bodies abstracted as a kinematic chain. We also illustrate an overview of the system, address its core processes and state certain assumptions that lead to the system’s realization. As expected, in the last sections of the paper we detail the physical experimental setup for the assessment of the system and consider possible avenues for future work.
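The kinematic-chain formulation borrowed from robotic mechanics can be illustrated with a planar two-link arm, mapping shoulder and elbow angles to elbow and wrist positions. This is a deliberate simplification for illustration; the paper's model of the human arm has more degrees of freedom, and the link lengths below are hypothetical.

```python
import math

def forward_kinematics_2link(theta1, theta2, l1=0.30, l2=0.25):
    # Planar 2-link chain: shoulder angle theta1 and elbow angle theta2
    # (radians), with upper-arm length l1 and forearm length l2 (metres).
    elbow = (l1 * math.cos(theta1),
             l1 * math.sin(theta1))
    wrist = (elbow[0] + l2 * math.cos(theta1 + theta2),
             elbow[1] + l2 * math.sin(theta1 + theta2))
    return elbow, wrist
```

Inverting such a model against observed joint positions, while keeping the link lengths fixed, is what lets a kinematic-model-based estimator smooth out the depth camera's noisy readings.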

Full Text PDF

