Background: As commercial motion tracking technology becomes more readily available, it is necessary to evaluate the accuracy of these systems before using them for biomechanical and motor rehabilitation applications.
Objective: This study aimed to evaluate the relative position accuracy of the Oculus Touch controllers in a 2.4 m × 2.4 m play-space.
Methods: Static data samples (n=180) were acquired from the Oculus Touch controllers at step sizes ranging from 5 to 500 mm, measured at 16 different points on the play-space floor using graph paper, in the x (width), y (height), and z (depth) directions. The data were compared with reference values obtained from digital calipers, accurate to 0.01 mm; physical blocks, whose heights were confirmed with digital calipers; and, for larger step sizes (300 and 500 mm), a ruler with hatch marks at millimeter intervals.
Results: The maximum position accuracy error of the system was 3.5 ± 2.5 mm at the largest step size of 500 mm along the z-axis. When normalized to step size, the largest error was 12.7 ± 9.9%, found at the smallest y-axis step size of 6.23 mm. When the step size was <10 mm in any direction, the normalized position error increased considerably, exceeding 2% (approximately 2 mm at maximum). An average noise value of 0.036 mm was determined. A comparison of these values with cited visual, goniometric, and proprioceptive resolutions suggests that this system is viable for tracking upper-limb movements in biomechanical and rehabilitation applications. The accuracy of the system was also compared with accuracy values reported in previous studies of other commercially available devices and of a multicamera, marker-based professional motion tracking system.
Conclusions: The study found that the linear position accuracy of the Oculus Touch controllers was within an acceptable range for measuring human kinematics in rehabilitative upper-limb exercise protocols. Further testing is required to ascertain repeatability across multiple sessions and rotational accuracy.
Current gaming and virtual reality platforms [ ] that use motion-controlled interfaces offer an affordable and accessible method of tracking human kinematics. However, given that consumer-grade platforms are originally intended for playing video games and immersing players in virtual environments, their tracking performance should be evaluated before they are employed as tools for biomechanical or clinical analysis [ ]. Previously tested rehabilitation protocols using commercial gaming technology, such as Wii Motes (Nintendo Co, Ltd, Kyoto, Japan) to provide positional feedback for trunk compensation [ ] or a Kinect (Microsoft Corporation, Redmond, United States) to measure range and speed of motion for upper-limb exercises [ , ], have shown potential as rehabilitation tools that could provide therapists with quantifiable changes in clients’ kinematic motor abilities. Other studies using accelerometers to track patterns in functional upper-limb movements were able to capture differences similar to those measured by clinical scales [ ] and found benefits from objective quantitative evaluations of changes in motor ability during therapy regimens, which can be collected from in-game progress reports [ ]. In addition, success has been found in translating kinematic upper-limb metrics to clinical Fugl-Meyer scoring [ ] and in detecting exercise repetitions via kinematic monitoring for telerehabilitation and at-home programs [ ]. Current clinical assessments for upper-limb motor function, such as the Fugl-Meyer Assessment and Wolf Motor Function Test, provide only low-resolution point-scores rated qualitatively by therapists, and kinematic analysis of upper-limb motion has been reported to be a useful addition to these clinical assessments [ ]. When measuring range of motion in a clinical setting, the goniometer is considered a gold-standard clinical measurement tool used by therapists [ ]. However, it can measure only static joint angles, and typically requires some visual estimation and multiple testers [ ].
One of the latest devices (released December 2016) developed for interacting with virtual environments is the Oculus Touch (Oculus VR, LLC, Menlo Park, CA, United States) controller set. The controllers are peripheral accessories of the Oculus Rift virtual reality headset and are employed to track users’ hand movements. Their tracking system uses a proprietary algorithm that combines data from infrared sensors via constellation tracking [ ] with inertial measurement units (IMUs). Given that the controllers are wireless, lightweight, low-cost devices that can track a user’s hand position and orientation in 3-dimensional (3D) space, they have the potential to be employed in rehabilitative and biomechanical motion-tracking applications. At the time of this study, the manufacturer provided insufficient information about the tracking performance of the controllers, and there is currently a lack of scientific papers employing a systematic approach to test their potential as tools for motion-tracking data capture. In this study, we therefore evaluated the tracking accuracy of the Oculus Touch controllers to present a preliminary evaluation that could inform the biomechanical and rehabilitation research community. The specific aim of the experiment was to quantify the relative positional accuracy of the Oculus Touch controllers in 3 spatial dimensions. As the controllers are intended for handheld motion control, the evaluation setup was centered on the movement sizes typical of standing or sitting upper-limb reaching tasks.
An Oculus Touch controller (Figure 1), 2 Oculus Sensors, an Oculus Rift headset, and a computer running Windows 10 (Microsoft Corporation) were employed in this study.
A custom computer application was developed in Unity 2017 (Unity Technologies, San Francisco, United States) to capture and log the controller’s position during the experiment. Data capture was performed at the headset’s native frequency of approximately 90 Hz, using the Unity OVR Plugin package to access controller data. The virtual environment was set up over a 2.4 m × 2.4 m play-space in the x-z plane to stay within the manufacturer’s recommended play area. This space consists of 16 commercial 600 mm square force/torque plates professionally installed on a subfloor of auto-levelling epoxy and flat to within 0.5 mm ( ). The y-axis was bounded only by the camera sensors’ field of view limitations.
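The capture step described above can be sketched in outline. The following is a minimal Python stand-in for the Unity logger, where `read_position` and `log_static_sample` are hypothetical names (the actual application used the Unity OVR Plugin API, not this code); it shows only the fixed-rate, timestamped sampling pattern the paper describes.

```python
import csv
import io

def log_static_sample(read_position, duration_s=5.0, rate_hz=90.0):
    """Collect one timestamped burst of controller positions.

    `read_position` is a hypothetical stand-in for the plugin call that
    returns the controller's (x, y, z) in metres; any callable taking
    the sample timestamp will do.
    """
    samples = []
    n = int(duration_s * rate_hz)  # 450 samples for 5 s at 90 Hz
    for i in range(n):
        t = i / rate_hz
        x, y, z = read_position(t)
        samples.append((t, x, y, z))
    return samples

def write_csv(samples):
    """Serialize samples as the kind of CSV log a capture app might keep."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["t_s", "x_m", "y_m", "z_m"])
    writer.writerows(samples)
    return buf.getvalue()

# A synthetic, noise-free "controller" held at a fixed point on the floor.
samples = log_static_sample(lambda t: (0.30, 0.0, 1.20))
print(len(samples))  # 450
```

At 90 Hz, a 5-second rest yields roughly 450 position samples per measurement point, which can then be averaged into a single static position estimate.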
To ensure consistency, the Oculus Sensors were placed on the floor at 0.3 m along the front edge of the space and 1.2 m apart, equidistant from the centre line, for the entire experiment. The sensor heads were manually leveled and visually aligned to have parallel, front-facing fields of view. Both the sensors and controllers maintained an initial y-position of 0 at the floor; this would be equivalent to placing the sensors at table height and the controllers at hand height.
All measurements were taken by securing the right-hand Oculus Touch controller to a flat L-shaped jig ( ) and resting it on the floor for 5 seconds. Initial calibration of floor height and play-space size and orientation was performed through the official commercial Oculus setup client.
Figure 1. The right-side Oculus Touch controller. Left: front view. Right: top-down view.
Determining the Accuracy of Oculus Touch Controllers for Motor Rehabilitation Applications Using Quantifiable Upper Limb Kinematics: Validation Study | Shum | JMIR Biomedical Engineering