Posts Tagged augmented reality

[WEB SITE] Electronic Caregiver Enters Clinical Trials With G60 Trauma

 Dec. 7, 2018

PHOENIX, Dec. 7, 2018 /PRNewswire/ — Virtual caregiving is set to enter both the healthcare industry and patient home settings. In January 2019, G60 Trauma will begin testing Addison Care, the world’s first comprehensive virtual caregiving system to provide real-time, 24/7 patient monitoring and care.

Dr. Alicia Mangram (center) and G60 Trauma staff visit the Electronic Caregiver tower in downtown Las Cruces, New Mexico.

Addison Care adds new components to an interactive voice platform, demonstrating an interactive augmented reality feature tied to visual sensing and connected home devices. Now, not only can you have a two-way conversation with an Electronic Caregiver, but the technology comes alive with an expertly designed augmented reality character named Addison, developed on AWS Sumerian. Addison provides a breakthrough user interface.

What can Addison do? In a clinical setting, Addison can greet a patient (identified through facial recognition), conduct a verbal health examination, collect vitals, and even direct a comprehensive gait and balance session to estimate the probability of a ground-level fall for a particular patient. In the residential environment, Addison provides medication and medical test reminders, verifies medication consumption, monitors vitals, demonstrates rehabilitation exercises, assesses a patient’s progress, mood and fall risk, and responds to escalating conditions and emergencies, including contacting responders or caregivers when the patient needs help.
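The article does not disclose how a gait-and-balance session is turned into a fall-risk probability. The general pattern, combining a few gait metrics into a single score, can be sketched; the feature names, weights and bias below are illustrative assumptions for this sketch, not Electronic Caregiver's actual model.

```python
import math

# Illustrative feature weights; assumptions for this sketch, not
# Electronic Caregiver's actual model. Slower gait, shorter steps and
# more postural sway all push the risk score up.
WEIGHTS = {"gait_speed_m_s": -2.0, "step_length_m": -3.0, "sway_cm": 0.5}
BIAS = 1.0

def fall_risk_probability(metrics):
    """Combine gait/balance metrics into a 0-1 risk score via a logistic model."""
    z = BIAS + sum(w * metrics[name] for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# A slow, short-stepped, high-sway gait scores riskier than a brisk one.
frail = fall_risk_probability({"gait_speed_m_s": 0.6, "step_length_m": 0.4, "sway_cm": 4.0})
steady = fall_risk_probability({"gait_speed_m_s": 1.3, "step_length_m": 0.7, "sway_cm": 1.0})
```

In a real system the weights would be fitted to outcome data rather than hand-set; the logistic form just keeps the output interpretable as a probability.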

How does Addison work? A network of wireless visual sensors, local AI (artificial intelligence) processors, interactive tablets, Bluetooth biometric devices and emergency monitoring devices will be set up in a residence. Addison Care will be marketed and supported by a network of nationwide private duty home care providers that will serve as both live caregivers and Addison Care representatives. SDS CEO Anthony Dohrmann said, “Our goal is to expand affordable population health care to the masses, while lightening the burden on providers and payers. We are delivering an exciting new form of technology to patients and the active aging to improve their quality of life and health outcomes.” Addison will make its debut at the Consumer Electronics Show in Las Vegas, January 8-11, 2019, at Booth 42142, Sands Convention Center Halls A-D.

Why partner with G60 Trauma Organization? Dr. Alicia Mangram, founder of G60 Trauma in Phoenix, Arizona, is a surgeon and acclaimed trauma specialist who has devoted her career to improving trauma care through advocacy, surgical and critical care research, education and community services. G60 Trauma is a specialized care program designed for trauma patients over the age of 60, with the goal of optimizing their recovery and safely discharging them back to their homes. This partnership will allow us to study hundreds of patients who have had a ground level fall and provide us with the data and information we need to continue producing products and services geared toward prevention and superior outcomes.

With an expert research team behind hundreds of successful research publications, the G60 Trauma team will conduct an expansive study involving over 500 patients to document the effectiveness of Addison Care and Electronic Caregiver in improving patient outcomes, increasing patient and family satisfaction, reducing hospital readmissions and reducing mortalities. The study will also examine treatment adherence, with the hope of validating a more effective, outcome-based continuum of care capable of reducing the long-term pressures and costs associated with long-term care and chronic disease management.

“The costs of treatment non-adherence have been reported to be as high as $300B annually, and non-adherence is noted as being responsible for 50% of all treatment failures. In a period of nursing and physician shortages, where home care is inadequate in frequency partly due to high cost, our hope is that Addison Care and Electronic Caregiver can fill the gap in patient care and bring better outcomes to the masses,” Dr. Alicia Mangram stated.

About SameDay Security, Inc. and Electronic Caregiver

SameDay Security (SDS) is one of the fastest-growing monitored technology providers in the U.S. and one of only a handful of nationwide service providers. Known as the Electronic Caregiver Company™ and founded in 2009, SDS currently provides automated home care solutions and safety devices nationwide to thousands of clients. SDS has invested over $35,000,000 in patient screenings, research and development, and will disclose a new capital offering after CES to fuel new product launches and expansion. SDS is developing contracts with hundreds of home care partners across America who will participate in marketing Addison Care to their clients. New clinical trials are scheduled with G60 Trauma of Phoenix, Arizona, involving 500 patients over 3 years to determine the impact on patient outcomes, cost reduction, hospitalization rates, chronic disease management and long-term care. Electronic Caregiver has over 70 employees and is headquartered in Las Cruces, New Mexico.

G60 is a specialized trauma care program developed by Dr. Alicia Mangram. Since 2009, Dr. Mangram has devoted her career to improving trauma care through advocacy, surgical and critical care research, education and community services. In the beginning of her career, she quickly realized that a traumatic injury in patients 60 years and older could occur from a simple fall resulting in a hip fracture. The traditional approach was to admit them to a medical facility and await medical clearance for pre-existing conditions such as diabetes, heart disease, etc. prior to any surgery.

While patients waited for medical clearance, other medical complications could develop. Recognizing the cause of these complications led to a paradigm shift and the implementation of an aggressive care approach for the G60 population. Through evidence-based research, Dr. Mangram and her team developed a care plan to address the needs of G60 trauma patients. These care plans achieved several goals, such as:

• Expedited early identification in the ER

• Admission to the trauma service

• Alternative pain management modalities (for example, hip block)

• Multidisciplinary care rounds with integration of the Biopsychosocial Model

• Evaluation of the care approach through research and data analysis

• Achievement of an optimal level of functioning and independence upon discharge

Electronic Caregiver logo. (PRNewsfoto/Electronic Caregiver)

SOURCE Electronic Caregiver


via Electronic Caregiver Enters Clinical Trials With G60 Trauma | Markets Insider

[WEB SITE] Augmented Reality At Spaulding Works To Help Patients Heal

It looks like Mark Priest is walking down the hall using arm crutches, but from his perspective, he says, “I’m trying to chase Pac-Man. Keep my head steady, keep a nice cadence.”

Priest is putting the visor technology to the test. “The concept is to improve my gait and my stride by keeping a certain speed.”

The technology he is using could revolutionize the way people with catastrophic injuries are helped.

“I have a spinal cord injury at level T-9 and T-12 of the vertebrae,” Priest explained.

The Inspire Lab, headed by Doctor Randy Trumbower, focuses on helping people move again. This year, his team of scientists created a new technology using augmented reality.

Trumbower said, “Augmented reality is a mix between what’s real and what’s not real…It’s a game changer for sure.”

That means computer-generated images are superimposed over the real environment. For example, a patient can follow a Pac-Man around the room at changing speeds.
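One simple way to drive such a pacing target "at changing speeds" is a piecewise-constant speed schedule that the patient tries to keep up with. This is a hypothetical sketch of the idea, not Spaulding's implementation:

```python
def target_position(t, speed_schedule):
    """Position (metres along the course) of the virtual pacing target at
    time t seconds, given a list of (duration_s, speed_m_s) segments.
    Changing the segments changes the cadence the patient is asked to hold."""
    pos, elapsed = 0.0, 0.0
    for duration, speed in speed_schedule:
        if t <= elapsed + duration:
            return pos + speed * (t - elapsed)
        pos += speed * duration
        elapsed += duration
    return pos  # the target stops at the end of the course

# Warm up at 0.5 m/s for 10 s, then pick up the pace to 1.0 m/s.
schedule = [(10.0, 0.5), (10.0, 1.0)]
```

Each frame, the renderer would place the Pac-Man sprite at `target_position(now, schedule)` along the hallway, so therapists can tune difficulty by editing the schedule rather than the game.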

“Different grades, steps, obstacles, things that you maybe wouldn’t experience in a traditional therapy setting,” said Priest.

It also means that this therapy can be targeted. “The thing that is attractive about this particular technology is that it extends those benefits in a way that is more personalized,” said Trumbower.

And ultimately it allows a patient to use that technology anywhere.

Priest said, “My goal is to improve my walking and get off the use of Loftstrand crutches and just be more independent in my day to day living.”

While Priest can only use the visor for short periods, he has still seen improvements. Friday was the first time he tried it without crutches.

“I’m really excited to see the advancement of this technology and how it can help.”

The lab is still in the early stages with the technology, but the promise is there and the work continues.


via Augmented Reality At Spaulding Works To Help Patients Heal « CBS Boston

[WEB SITE] VR, AR and the NHS: How virtual and augmented reality will change healthcare – Video

Virtual reality (VR) and augmented reality (AR) technologies are still at an early stage, but both could have significant benefits for the NHS.

Healthcare organisations could be spending as much as $5bn globally on AR and VR by 2025, according to one prediction, with potential applications ranging from surgical simulation and diagnostic imaging to patient care and rehabilitation. VR headsets — like the Oculus Rift or HTC Vive — offer a fully immersive experience, while AR headsets — like Microsoft HoloLens or Magic Leap — allow you to overlay virtual objects onto the real world to create a mixed-reality experience. Both options are being explored by doctors around the world.

And while countries with private healthcare systems are leading the way in VR adoption, countries dominated by publicly-funded healthcare are also exploring these technologies.

Surgeons in Poland have already demonstrated how Google Glass could be used to help plan heart procedures, and now NHS clinicians are following suit. Microsoft’s HoloLens has been used to help surgeons plan operations: for example, surgeons at Alder Hey hospital in Liverpool hope to use it to visualise patients’ scans during procedures, while three surgeons in three separate UK hospitals have used it for bowel cancer surgery.

A team of surgeons at Queen Mary’s Hospital has also been experimenting with wearing HoloLens headsets during surgery, overlaying a map of the patient’s anatomy — showing the path of blood vessels and the course of muscle groups — directly onto the patient. The map is created from CT scans of the patient and allows the surgeons to navigate away from important structures during the operation. “There’s a lot of activity in this area now these types of devices are readily and commercially available,” says Philip Pratt, research fellow in the Department of Surgery & Cancer at Imperial College London.

For example, surgeons who can visualise anatomy using the headsets can more easily avoid sensitive structures when making incisions, potentially reducing the time it takes for the patient to heal and therefore the time they spend in hospital, and similarly reduce the need for any secondary or corrective surgery.
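At its core, overlaying a CT-derived map onto the live view is a coordinate transform from scan space into the headset's world frame. Clinical registration is 3D and estimated from fiducials or surface matching; the 2D sketch below, which is not drawn from the Queen Mary's system, only shows the shape of the computation:

```python
import math

def ct_to_world(point, rotation_deg, translation):
    """Map a 2D point from CT-scan coordinates into the headset's world
    frame with a rigid transform (rotation then translation). Clinical
    registration is 3D and estimated from fiducials or surface matching;
    this only illustrates the core idea."""
    theta = math.radians(rotation_deg)
    x, y = point
    xr = x * math.cos(theta) - y * math.sin(theta)
    yr = x * math.sin(theta) + y * math.cos(theta)
    return (xr + translation[0], yr + translation[1])

# A vessel point 1 m "ahead" in scan space, with the scan rotated 90 degrees
# relative to the room, lands 1 m to the side in world coordinates.
p = ct_to_world((1.0, 0.0), 90.0, (0.0, 0.0))
q = ct_to_world((0.0, 0.0), 0.0, (2.0, 3.0))  # pure translation
```

Once every scan point can be mapped this way, the headset simply renders the mapped geometry at those world coordinates and the anatomy appears anchored to the patient.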

Training up future doctors could also be a growth area for VR. While universities, rather than the NHS, are charged with training medical students, the vast majority of graduates from UK medical schools will go on to work in the NHS, and so their degrees are designed to fit the needs of the health service. Once qualified, junior doctors still receive significant amounts of training within their NHS hospital placements too.

There are already a handful of proof-of-concept anatomy teaching modules that use AR. One medical school in the US is planning to do away with its anatomy lab altogether in favour of using HoloLens.

And for junior doctors — those undertaking training in a medical speciality such as general surgery, psychiatry, or respiratory medicine — VR is likely to have an even greater role in their future training, helping them learn how to perform new procedures in a virtual hospital and even experience them from the patient’s point of view to help improve their communication skills. NASA has even considered using AR headsets to help astronauts conduct medical examinations aboard the International Space Station.

VR and AR won’t just be doctors’ tools, however: early experiments are showing that patients too could be seeing more virtual reality headsets in their future.

Companies like Oxford VR have been trialling the use of VR technology to help mental health patients. The University of Oxford spinout has been using simulated environments and a virtual coach to help people tackle their fear of heights.

The National Institute of Health Research is funding Oxford VR to the tune of £4m to develop a VR therapy package for patients with psychosis; it is also working on a package for young people with social anxiety. Commercial VR headsets are combined with custom software to create virtual versions of the environments that patients would typically find difficult, allowing them to explore those situations safely and, eventually, become comfortable with them. The NIHR said it believes the funding will help create a VR product that will be taken up by the NHS.

VR therapy has also been tested at King’s College and the South London and Maudsley trust for improving auditory hallucinations in people with schizophrenia, and to help families affected by the Grenfell fire.

There have also been signs that virtual reality could be used for neurological, as well as psychiatric, conditions. Traumatic brain injury can leave people severely disabled, and can require intensive rehabilitation before they’re able to perform their day-to-day activities unassisted. It’s thought VR could be used for both assessing and treating traumatic brain injury. For example, avatars in VR have been used to assess patients’ higher brain functions or detect cognitive problems, while virtual kitchens have helped healthcare professionals assess how people with traumatic brain injury can undertake normal daily activities. VR has also been used to improve balance or attention after traumatic brain injury.

For now, however, most use of VR as a treatment tool is very much in the pilot stage, offered to relatively few people; larger trials with hundreds of participants would be needed before it would be possible to assess the benefits of such treatments for the population as a whole.

It’s worth remembering that even outside of the healthcare space, VR and AR are at an early stage; the hardware is still cumbersome, the applications still evolving, and the pricing still way too high. As the general market evolves, ways to apply the technology to the NHS may become clearer. Sectors such as retail, for example, where spending on new technology is less constrained, will potentially act as examples of how the technology can be used in healthcare.

“When I see AR or VR in the retail environment, I think it’s almost desensitising some of the use cases that the NHS can then take a more bold step towards,” Andrew Finlayson, managing director of Accenture Interactive, told ZDNet.

“I do think the more that gaming and those techniques are becoming popular, the more that those companies will make their technologies more accessible to the wider populace, and they will either drop the price or they will subsidise the headsets — make them cheaper so they can sell more games, or telecoms or media. I think gaming is bringing down some of the cost and accessibility that would allow more virtual and augmented reality [in the NHS].”

The success of AR and VR has been just around the corner for decades, but if the most recent crop of headsets proves a success with consumers and businesses, expect to see a lot more use of such technologies in the NHS in the near future.


via VR, AR and the NHS: How virtual and augmented reality will change healthcare | ZDNet

[Thesis] Designing an augmented reality video game to assist stroke patients with independent rehabilitation


Early, intense practice of functional, repetitive rehabilitation interventions has shown positive results towards lower-limb recovery for stroke patients. However, long-term engagement in daily physical activity is necessary to maximise the physical and cognitive benefits of rehabilitation. The mundane, repetitive nature of traditional physiotherapy interventions and other personal, environmental and physical elements create barriers to participation. It is well documented that stroke patients engage in as little as 30% of their rehabilitation therapies. Digital gamified systems have shown positive results towards addressing these barriers to engagement in rehabilitation, but there is a lack of low-cost commercially available systems that are designed and personalised for home use. At the same time, emerging mixed reality technologies offer the ability to seamlessly integrate digital objects into the real world, generating an immersive, unique virtual world that leverages the physicality of the real world for a personalised, engaging experience.
This thesis explored how the design of an augmented reality exergame can facilitate engagement in independent lower-limb stroke rehabilitation. Our system converted prescribed exercises into active gameplay using commercially available augmented reality mobile technology. Such a system introduced an engaging, interactive alternative to existing mundane physiotherapy exercises.
The development of the system was based on a user-centered iterative design process. The involvement of health care professionals and stroke patients throughout each stage of the design and development process helped understand users’ needs, requirements and environment to refine the system and ensure its validity as a substitute for traditional rehabilitation interventions.
The final output was an augmented reality exergame that progressively facilitates sit-to-stand exercises by offering immersive interactions with digital exotic wildlife. We hypothesize that the immersive, active nature of a mobile, mixed reality exergame will increase engagement in independent task training for lower-limb rehabilitation.

via Designing an augmented reality video game to assist stroke patients with independent rehabilitation

[WEB SITE] Augmented reality game helps stroke victims recover faster

An augmented reality game that helps stroke victims recover. (Photo: Petrie, et al)

More than six million people worldwide die each year from strokes. Every two seconds, someone, somewhere is having one. Not all strokes are fatal, of course. In fact, 80 per cent of stroke victims survive, though many experience one or more serious lingering effects, including paralysis and cognitive and motor impairment. When a stroke occurs, areas of the brain are deprived of oxygen and neural pathways can become damaged. The good news is that the brain is a resourceful organ, and thanks to neural plasticity, it may be possible to relearn forgotten abilities through rehabilitation—targeted repetitive exercises—that helps the neurons re-organize themselves and allows the victim to regain function.

The problem is that rehab is hard and painful, and according to Regan David Petrie, some 69 per cent of stroke patients don’t get the recommended level of rehab activities. This is why the master’s student at Victoria University of Wellington has been developing an augmented reality (AR) mobile game, an “exergame,” whose purpose is to engage and reward stroke victims in order to keep them engaged in their therapy.

NZ Fauna AR

Petrie’s game was designed using Google’s Tango Augmented Reality platform prior to the search giant switching support to its newer, more consumer-oriented ARCore system. As the game’s player observes his or her surroundings through a mobile device, virtual 3D objects appear to set the scene and with which the player can interact.

AR in room

(Photo: Petrie, et al)

The game, still under development, is called NZ Fauna AR. As its name implies, it’s designed for stroke victims of New Zealand, leveraging their love of the country’s forests to provide a calming and enjoyable context in which play can occur. Fizzy, a virtual Rowi kiwi, is the AR star of the current iteration of the game.

Meet Fizzy AR

(Photo: Petrie, et al)

Players gather blueberries and feed them to Fizzy by performing sit-to-stand exercises, an important form of therapy for stroke victims. The most basic actions of the game are:

• standing up to throw berries to Fizzy

• sitting down to collect more berries from an AR bucket on the floor.

There are game controller buttons with interactive elements, but, says Petrie’s thesis, “The game was designed to incorporate minimal touch interactions—this was driven by the interaction model which was comprised of natural physical movements,” that is, standing up and sitting down.[…]
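Mechanically, the stand-to-throw / sit-to-collect loop means the game has to detect completed sit-to-stand repetitions from the device's tracked pose. A minimal sketch of that detection, using hysteresis so mid-rise jitter is not double-counted; the height thresholds are illustrative assumptions, not values from Petrie's thesis:

```python
STAND_T, SIT_T = 1.2, 0.8  # head-height thresholds in metres (illustrative)

def count_sit_to_stands(heights):
    """Count completed sit-to-stand repetitions from a stream of estimated
    head heights (e.g. from the AR device's pose tracking). The gap between
    the two thresholds is hysteresis: mid-range jitter cannot flip the state."""
    reps = 0
    standing = bool(heights) and heights[0] >= STAND_T
    for h in heights:
        if not standing and h >= STAND_T:
            standing = True   # rose above the stand threshold: one rep done
            reps += 1
        elif standing and h <= SIT_T:
            standing = False  # back in the chair, ready for the next rep
    return reps

# Two clean repetitions; values between the thresholds are ignored.
trace = [0.7, 0.9, 1.3, 1.1, 1.25, 0.7, 0.75, 1.3, 0.6]
reps = count_sit_to_stands(trace)
```

In the game, each detected stand event would trigger the berry throw and each sit event would refill the bucket, so no touch interaction is needed for the core loop.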

more —> Augmented reality game helps stroke victims recover faster | Big Think

[WEB SITE] 8 ways augmented and virtual reality are changing medicine

Israeli companies are using futuristic technologies to simplify complex surgery, manage rehab, relieve pain, soothe autistic kids and much more.

The Realview HOLOSCOPE-i augmented reality system for cardiac surgery. Photo courtesy of Business Wire

Spine and heart surgeons will use augmented reality (AR) to simplify complex procedures. Autistic children will get relief from sensory overload with a calming virtual reality (VR) system.

These and other scenarios are made possible by Israeli innovations tapping into the tremendous potential of AR and VR for healing and wellbeing.

The two approaches differ in method: AR superimposes static and moving images to enhance an actual environment, while VR immerses the viewer in a simulated three-dimensional environment.

“Israel is on the frontlines in some areas of this technology,” says Orit Elion, a professor of physical therapy at Israel’s Ariel University, which hosted a conference last year to strengthen cooperation between AR and VR developers and researchers for health applications.

Elion helped develop a VR-based tele-rehab service at the Gertner Institute of Chaim Sheba Medical Center in Tel Hashomer, now used across Israel to enable monitored home physical or occupational therapy sessions for patients living far from healthcare centers.

“There aren’t so many programs in the world like this — a service that has no geographic boundaries,” Elion tells ISRAEL21c.

Currently, she is investigating how VR training can help with balance and fall prevention in the elderly. “VR is a dream for that, because you can manipulate the environment with all kinds of visual input,” she says.

Here are other examples of Israeli AR and VR in the health sector.


Surgical training

Surgical Theater makes a portfolio of VR products based on the notion that surgeons could train for complex procedures much like the Israeli founders of the company trained for Israel Air Force missions. Neurosurgeons at major medical centers and academic institutions in the United States and elsewhere are utilizing Surgical Theater’s VR medical visualization platforms for surgical planning and navigation, patient education and engagement, and training surgical residents.

Heart surgery

In the first quarter of 2018, RealView Imaging will release its long-awaited HOLOSCOPE-i, designed to deliver live, in-air 3D holographic visualizations during interventional cardiology procedures.

Powered by the Intel RealSense SR300-Series camera based on RealView’s proprietary digital light shaping technology, HOLOSCOPE-i is the first commercial system allowing clinicians full and direct control of 3D images in real time. Surgeons can rotate, zoom, slice, mark and measure within the floating holograms.

Coming next from RealView Imaging are HOLOSCOPE-x for visualization of holograms inside the patient during interventional oncology procedures, and a holographic headset for non-medical professional applications.

Spine surgery

Augmedics develops xvision, an AR head-mounted display for spine surgery that allows surgeons to see the patient’s anatomy through skin and tissue, as if they had “x-ray vision.” The system can project the patient’s anatomy, in real time, directly onto the surgeon’s retina, with the aim of increasing safety in surgery, reducing x-ray radiation and facilitating minimally invasive procedures.

Using xvision, surgeons will be able to visually and accurately track all their surgical instruments well within their field of vision as they work. A combination of proprietary tracking algorithms, hardware, software, an image data merging unit, and specialized instruments guide the surgeon through the operating site during major and minor procedures.

The xvision system will also utilize sensors to collect surgical information, which, when connected to a big data system, will be analyzed and processed using deep learning algorithms to provide alerts and suggestions to assist the surgeon during the procedure.

Augmedics has already performed pre-clinical cadaver trials in the US and EU. The company will start clinical studies in Q2 2018 in Israel, and later this year at the Johns Hopkins Hospital in Baltimore, Maryland.

Sensory modulation

Using VR goggles, the Calma system immerses an autistic child in a simulated underwater scene filled with corals, colorful fish, bubbles and divers.

“Children on the autism spectrum typically suffer from sensory modulation disorder, traditionally treated in a ‘white room’ where various objects are gradually introduced. This is costly and not always readily available. Our initiative simulates the white room with VR,” says Dan Kohen-Vacs, a senior computer science researcher at Holon Institute of Technology (HIT), where Calma was invented by students last year.

A management console allows the therapist to add, moderate or remove stimulants (including music) in response to the reaction of the child in real time. The goal is to train the child’s sensory regulation system to better handle auditory and visual stimulants and achieve emotional balance.

“We are completing the first proof-of-concept version and testing it in the Dekalim school in Jerusalem,” Kohen-Vacs tells ISRAEL21c. HIT’s tech-transfer company will work on commercializing the system.

“The plan is to expand to other locations. It may be possible to enable parents to use the system at home. You just need a smartphone and something like Google Cardboard that enables you to put the phone in it and wear it as headset,” says Kohen-Vacs.

Amit Bar-Tov, an occupational therapist at Dekalim, told Globes that the Calma pilot met with “great enthusiasm among the students for emotional regulation and sensory regulation, an improvement in learning capabilities, and a better connection with the environment.”

Burn rehab

Prof. Josef Haik, director of Sheba Medical Center’s Burn Center, has been using VR for more than a decade as a bedside tool to ease the painful process of rehabilitation from severe burns.

“It’s all about early mobilization and rehabilitation, getting back to the tasks of everyday life,” Haik tells ISRAEL21c.

In 2004, Sheba installed a large Computer Assisted Rehabilitation Environment (CAREN) system in a pioneering move toward VR in treatment and rehab. Burn patients couldn’t be moved to the CAREN room so Haik came up with an inexpensive portable alternative using EyeToy, a digital camera device for PlayStation. Today he’s using Kinect with games devised for patients with certain disabilities.

VR gaming therapy offers several advantages, says Haik: The games distract patients and thereby lessen their pain perception; allow patients to adapt to seeing and accepting the look of the scarred area of their body onscreen; and use rewards such as points to encourage continuation of therapy. Moreover, the patient does not have to wear or touch anything, eliminating any risk of cross infection.

Haik reported on the therapy in a 2006 study and has presented his approach to the American Burn Association and other associations around the world.

Stroke and traumatic brain injury  

The SeeMe VR rehab system was developed by physiotherapists at Beit Rivka Geriatric Rehabilitation Hospital in Petah Tikva in cooperation with Brontes Processing of Poland for stroke or traumatic brain injury patients.

It has been on the market since 2009, making it the first commercial VR system of its kind.

SeeMe’s technology transmits images to the patient’s computer via a Kinect controller or standard web camera and immerses the patient in a customized computer game requiring specific exercises set by the therapist.

The clinician can use the system to evaluate strength, endurance, range of motion, postural control, reaction time, proprioception, quality of movement, perception, divided attention and memory.

Parkinson’s disease and multiple sclerosis

Studies by scientists from the Technion-Israel Institute of Technology, Tel Aviv Medical Center and Tel Aviv University over the past decade have shown that incorporating VR headsets in gait training improved the walking abilities of people with multiple sclerosis and reduced fall risk in Parkinson’s patients. The latest study, published in Neurology in September, found that VR training actually modifies brain activation patterns in Parkinson’s patients.

PT and pain relief

Caesarea-based Motorika Medical’s ReoAmbulator robotic gait-training device helps adults and children improve walking, balance, coordination, posture or stamina while focusing on accomplishing VR tasks to improve motor or cognitive function including memory and selective attention. Combining these tasks in one session is meant to add a higher degree of challenge leading to better results. On the market since 2014, ReoAmbulator is used in two countries in Asia, five in Europe and in the United States — around 30 installations so far.

VRHealth of Tel Aviv and Boston is partnering with major players including Oculus, HTC and Microsoft to launch the first cross-platform-compatible VR medical application for rehab.

“We believe we are the only medical device company using an immersive headset as certified medical software,” founder Eran Orr tells ISRAEL21c. “What makes it a medical device is how you keep the data encrypted, how you can integrate electronic medical records and whether there is a billable insurance code for physicians to use. There is quality assurance and documentation for every app we are developing.”

VRHealth’s flagship VRPhysio software applications – two for neck therapy and one for shoulder therapy – got FDA clearance and are being implemented first in Spaulding Rehabilitation Hospital and Beth Israel Deaconess Medical Center in Boston. CE approval for Europe is expected soon.

VRHealth plans to launch additional products within a year: VRCoordi, which will work with VRPhysio to improve coordination skills, initially of children with developmental coordination disorder and various levels of autism; VRCogni, to improve cognitive function in stroke, Alzheimer’s, concussion, Parkinson’s and dementia patients; VRReliever, to manage chronic and severe pain through distraction; and VRPsyc, to enhance treatment for diagnosable mental disorders including general stress, phobias and anxieties, eating disorders, and PTSD.

“I hope our company will make a difference in the entire healthcare sector — every hospital, nursing home and assisted living — because VR can make a huge difference in many fields,” says Orr.

“I think Israel has a lot of potential in this technology because of the quality of engineers and developers able to develop products at a rapid pace and to be first to market and expand from there. That’s why we maintain our R&D in Israel.”

via 8 ways augmented and virtual reality are changing medicine | ISRAEL21c


[WEB SITE] Augmented and virtual reality will involve human senses in verifying the operations of information systems — ScienceDaily

Many new applications aim to make information systems and machines identify their users and take their individual needs and emotions into account. VTT Technical Research Centre of Finland Ltd studied how ordinary consumers could reliably verify the operation of systems by using human senses.

In the future, machines and AI systems will have a deeper understanding of the actions of their human users. Even now, AI is able to generate an image of what a human is watching on the screen just by recording brain activity or deduce the emotions of people from microexpressions taken from their faces.

In the Human Verifiable Computing project, VTT used augmented and virtual reality to develop solutions for building trust between people and systems and facilitating the verification of information security. This is a vital aspect of the digital future, in which interaction between people and computers will be an effortless part of everyday life. “Augmented and virtual reality technologies let us make fuller use of our senses and enable the constant mutual evaluation of reliability between humans and machines,” says Senior Scientist Kimmo Halunen of VTT.

Making cryptographically verifiable computing available to human users was a key part of the project.

The project demonstrated functionalities involving computing verified with human senses. For example, augmented reality was utilized to distribute single-use passwords, which could then be used through voice recognition. Augmented reality was also utilized to give multisensory feedback by showing visual instructions to a maintenance worker who turns a valve and receives an error message if the valve is operated incorrectly. The message can be implemented as an interactive image and also presented through audio on the user’s smart glasses. In addition, haptic feedback can be provided by making the user’s smart watch or other mobile device vibrate.
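The single-use password flow described above can be sketched in a few lines: the system issues a short-lived token (which the AR overlay would display to the user), the user speaks it back, and the verifier accepts it exactly once so a replayed token is rejected. This is a minimal illustration of the pattern, not VTT's implementation; all names and structure here are assumptions.

```python
import secrets

class OneTimePasswordIssuer:
    """Hypothetical sketch of a single-use password verifier.

    issue() creates a token the AR overlay would show the user;
    verify() accepts each token at most once, so replays fail.
    """

    def __init__(self):
        self._pending = set()

    def issue(self) -> str:
        # Random token the AR display would present to the user.
        token = secrets.token_hex(4)
        self._pending.add(token)
        return token

    def verify(self, spoken: str) -> bool:
        # Accept the token once, then invalidate it so a replay fails.
        if spoken in self._pending:
            self._pending.remove(spoken)
            return True
        return False

issuer = OneTimePasswordIssuer()
code = issuer.issue()
first = issuer.verify(code)   # first use is accepted
second = issuer.verify(code)  # replaying the same token is rejected
```

In a real deployment the token would also carry an expiry time and the spoken input would pass through the voice-recognition layer first; the one-shot invalidation is the property that makes the password "single-use".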

The results of the project indicate that the basic technology required for the verification of computing with the human senses is already available. The combination of augmented reality and safety information will also enable new services. Current cryptographic methods and protocols are nearly always applied to communication between machines. Including the user in the interaction will nevertheless require more research and system and application development, as well as more study of human behaviour.

Story Source:

Materials provided by VTT Technical Research Centre of Finland. Note: Content may be edited for style and length.


via Augmented and virtual reality will involve human senses in verifying the operations of information systems — ScienceDaily


[Abstract] User experience and interaction performance in 2D/3D telecollaboration


Affordable 3D cameras, mixed reality headsets, and 3D displays have recently pushed Augmented Reality (AR) and Virtual Reality (VR) technologies into the consumer market. While these technologies have been adopted in the video-game and entertainment industries, adoption for professional use, such as in industrial and business environments, health care, and education, is still lagging behind. In light of recent advances in mobile communications, AR/VR could pave the way for novel interaction and collaboration among geographically distributed users. Despite the technology being available, the majority of communication is still accomplished using traditional video conferencing, which lacks interactivity, depth perception, and the ability to convey non-verbal cues. 3D communication systems have been proposed to overcome these limitations; however, very few studies have looked into performance and interaction with such technologies. In this paper, we report on a study that examined a telecollaboration scenario with three different modalities: 2D video conferencing, a 3D stereoscopic interface, and a 3D stereoscopic interface with augmented visual feedback. Twenty participants worked in pairs, assuming the roles of instructor and worker, to remotely interact and perform a set of assembly tasks.


via User experience and interaction performance in 2D/3D telecollaboration – ScienceDirect


[Abstract] Cloud-supported framework for patients in post-stroke disability rehabilitation.


Highlights:

Cloud-based rehabilitation services for post-stroke hand disability.

Tensor-based pattern recognition technique to detect the real-time condition of the patient.

Integration of cloud computing with an AR-based rehabilitation system.

Multi-sensory, big-data-oriented tensor approach to handle the patient’s collected data.


Given the flexibility and potential of cloud technologies, cloud-based rehabilitation frameworks have shown encouraging results as assistive tools for post-stroke disability rehabilitation exercises and treatment. To treat post-stroke disability, cloud-based rehabilitation offers great advantages over conventional, clinic-based rehabilitation, providing ubiquitous flexible rehabilitation services and storage while offering therapeutic feedback from a therapist in real-time during patients’ rehabilitative movements. With the development of sensory technologies, cloud computing technology integrated with Augmented Reality (AR) may make therapeutic exercises more enjoyable.

To achieve these objectives, this paper proposes a framework for cloud-based rehabilitation services, which uses AR technology along with other sensory technologies. We have designed a prototype of the framework that uses the mechanism of sensor gloves to recognize gestures, detecting the real-time condition of a patient doing rehabilitative exercises. This prototype framework is tested on twelve patients not using sensor gloves and on four patients wearing sensor gloves over six weeks. We found statistically significant differences between the forces exerted by patients’ fingers at week one compared to week six. Significant improvements in finger strength were found after six weeks of therapeutic rehabilitative exercises.
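The week-one versus week-six comparison reported above is a classic paired design: each patient's finger forces are measured before and after the therapy, and a paired t-statistic tests whether the per-patient change is significant. Below is a minimal, stdlib-only sketch of that calculation; the force values are invented for illustration and are not the study's data.

```python
import math

def paired_t(before, after):
    """Paired t-statistic for per-patient measurements taken at two
    time points (e.g. finger force at week 1 vs week 6).

    before, after: equal-length sequences, one reading per patient.
    Returns t = mean(diff) / (sd(diff) / sqrt(n)).
    """
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the per-patient differences (n - 1 denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical finger-force readings in newtons, one per patient.
week1 = [2.1, 1.8, 2.4, 2.0]
week6 = [3.0, 2.6, 3.1, 2.9]
t = paired_t(week1, week6)  # positive t indicates improved force
```

The resulting t-value would then be compared against a t-distribution with n - 1 degrees of freedom to obtain the p-value the study reports as statistically significant.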

via Cloud-supported framework for patients in post-stroke disability rehabilitation


[WEB SITE] Beyond haptics: blurring the line between your virtual avatar and your body

Main image credit: Teslasuit

In the space of a few short years, virtual reality has gone from being a technology of the future to part of the mainstream. Devices ranging from the humble Google Cardboard to the Oculus Rift have invaded our living rooms, and VR is set to transform everything from education to the sex industry.

But if VR is to achieve the mass appeal many are predicting then it needs to feel, as well as look, as real as possible, and not just like we’re passively watching a TV set strapped to our faces; the rest of our body needs to be as fully engaged as our eyes.

Let’s get physical

Enter haptic technology, which allows us to literally feel what we’re experiencing in VR. You’ve likely come across haptic tech, sometimes referred to as just ‘haptics’, before, for example when you’ve played a video game and felt a rumble in the handset.

Now companies like Teslasuit and Hardlight VR are bringing that experience to your whole body, with suits that can move and shake and vibrate in specific areas as you explore virtual worlds.

Design sketches for the Teslasuit, which enables the wearer to experience touch, heat and cold. Image credit: Teslasuit

But let’s slow down for a second, because this haptic tech is far from becoming mainstream and, crucially, you can’t just put a haptic suit on someone and expect a VR experience to feel real.

That’s why there’s a lot of research going on into what’s known as ‘virtual embodiment’.

This is a complex and fairly new area of study, but it’s concerned with using technology, virtual representation, avatars, storytelling, haptics and all kinds of other subtle visual, auditory and sensory cues to make you feel like you’re inhabiting another body. Whether that’s an avatar of yourself, someone else or even something else.

Exploring the body/mind connection

Virtual embodiment might be a new area of study, but it’s built on research about the connection between our minds and our bodies that goes back more than a decade.

One example is what’s known as the ‘rubber hand illusion’. This was an experiment that essentially proved that, with the right stimuli, people took full ownership of a rubber hand as their own.

Fast-forward to the present day and similar studies have put the rubber hand illusion to the test in a VR setting.

In a 2010 study, researchers found that synchrony between touch, visual information and movement could induce a believable illusion that people actually had full ownership of a virtual arm.

Similar studies have looked at the efficacy of using avatars for rehabilitation and visual therapy, with research suggesting that, in most cases, our virtual bodies can feel as real as our physical bodies.

Defining virtual embodiment

To find out where the research is right now, we spoke to Dr Abraham Campbell, Chief Research Officer at MeetingRoom and Head of the VR Lab at University College Dublin.

“Virtual embodiment is a difficult thing to define as it can mean a lot to different people in different fields,” Campbell explained. He proposes that we look at virtual embodiment in three categories, all of which are a modified version of Tom Ziemke’s work on embodiment.

“Firstly, structural coupling is the most basic and classic definition of embodiment,” Campbell told us. “You’re connected to some form of structure. For example, a body. You move your limb in real life, and a virtual limb moves mimicking your actions…you are embodied within the VR world.”

Campbell offers the example of moving the HTC Vive controller in the real world, and that becoming a hand that’s moving in the virtual world.

Structural coupling – for example controlling virtual limbs with your own limbs – is the basic definition of embodiment within a virtual world. Image credit: Razer OSVR

Next up is historical embodiment, which is when the VR world you enter ‘remembers’ what’s happened in the past. Campbell uses the example of drawing on a white board, when what you’ve drawn stays there when you return in a day, a week or year from now.

“Finally, social embodiment is when you interact with real or artificial entities within the VR world,” Campbell says. “These interactions have to be behaviorally realistic, so you feel that your body is able to interact with them in the environment.”

And why is studying embodiment important? Campbell explains: “The more embodied the agent or human is within the environment, the more capable they are of interacting and sensing that world.”

Social interaction and education

Campbell’s main focus is on social collaboration in a recreational and educational setting.

“I’m examining the use of telepresence and VR in education, and exploring how I can remotely teach from Ireland to China using technology like Kinect [Microsoft’s motion-sensing input devices for its Xbox game consoles] to scan me in real time, while at the same time view the classroom in China using a set of projectors,” he said.

Bringing a teacher into a distant, virtual classroom will certainly be useful. But the next challenge is working on the intricacies of social interaction, such as facial expressions.

”Being able to convincingly communicate with others in a face-to-face way at a distance is one of the most exciting possibilities.”

Dr Gary McKeown

And although interacting with people may not sound like the most interesting use of virtual embodiment, it’s one that’s bound to get attention.

“It is also clearly an industry goal,” Dr Gary McKeown, Senior Lecturer at the School of Psychology at Queen’s University Belfast, tells us. “It is not a coincidence that the company with the most to gain from making the social interaction aspects of virtual reality function well – Facebook – is the one that bought Oculus.”

Remote instruction

Imagine being able to remotely control machinery, or just help out family from thousands of miles away – it would change so much about work, commuting and social interaction.

One area that’s particularly interesting to Campbell is using embodiment research to aid telepresence or telerobotics, which is the use of virtual reality or augmented reality (AR) to do just that.

“I’m fascinated by Remote Expert, which is being pioneered by DAQRI,” he told us. “This allows an expert in a field to be remotely placed in augmented reality beside a non-expert to perform a complex task. DAQRI are looking at medical and industry fields to apply this technology, but you can imagine lots more applications.”

Campbell explained that one of the many uses for this kind of tech could be if an oil pipeline bursts, and the engineer who designed it is in another country. A local engineer could go out to fix the pipeline, with the designer advising them in real time using VR or AR and a stereo 360-degree camera.

A European Space Agency astronaut remotely guides a surface rover around a test site in California as part of NASA’s Surface Telerobotics program. Image credit: NASA

As we learned above, the tech enables the presence element of this. But where embodiment research comes in is making it more engaging, more realistic… ultimately more real, and with it the power to really offer help unhindered by technology.

Campbell explained: “The remote expert needs to be able to use hand gestures to demonstrate what the non-expert should do.

“The expert should be scanned in 3D in real time along with the remote world they are being placed into. This embodiment will allow them to truly be able to assist in whatever complex task they’re asked to perform.”

The implications of this are massive, and could radically change a number of industries.

NASA already has a telerobotics research arm that’s looking at using this technology for space exploration, and it’s being introduced into other fields, from engineering to medicine.

Campbell believes this kind of telepresence will have a big impact on the medical industry as technology advances too.

“One solution I hope to explore in future is to use a full hologram projector pyramid,” he told us. “This approach has been suggested to me by medical professionals who want to meet patients remotely by using a full size projector pyramid [i.e. one that’s about two metres tall]. With this kind of tech, the doctor will be better able to diagnose a patient.”

Therapy and rehabilitation

Virtual embodiment doesn’t just have huge implications for exploring physical presence, but mental and emotional presence too.

In a 2016 study, researchers discovered that virtual avatars that look like our physical selves can help people feel a sense of embodiment and immersion that, it’s believed, could enable them to better work through mental health challenges, as well as real-world trauma.

VR software developer ProReal uses virtual environments containing avatars to play out scenarios that help people deal with a range of challenges, from bullying to PTSD and rehabilitation.

As the tech advances, it could provide a whole new area of therapy for those who aren’t getting the results they need from talking therapies or medication.

Avatars are useful for exploring different perspectives in complex therapeutic situations. Image credit: ProReal

But it’s not just more serious mental health challenges, like PTSD, that can be explored; avatars can be used to increase confidence or change our perception of ourselves. Campbell told us about the time he noticed that those with bigger avatars felt more powerful.

“One accidental discovery I had, when I looked at games in VR, was that when the avatar is on average one foot taller than the other characters, it makes the player feel more powerful than the computer-controlled characters,” he explained.

That observation mirrors a 2009 study in which researchers found that those given taller, bigger avatars behaved differently and more aggressively in interactions with others.

So aside from therapy and mental health use cases, it’s possible to imagine VR being used in corporate settings, to make people feel more confident before presenting to a boardroom.

The challenges of embodiment

If it’s easy for researchers to think of creative ways in which embodiment could have a positive impact on our lives, it’s not much of a leap to consider the negatives too.

Some tech commentators believe social isolation could be an issue as the use of VR headsets becomes more widespread and experiences become more immersive, a concern that’s likely to become more prevalent in gaming.

But many within the industry believe the focus on social isolation is just scaremongering.

“I haven’t witnessed people feeling isolation,” Campbell explained. “Even students who are interested in VR for pure escapism want to share it with others afterwards, and have become evangelists for VR in its ability to be an empathy machine, as with embodiment you can truly get a sense of seeing things from someone else’s perspective.”

Another important talking point is around dissociation or detachment from your own body after exploring virtual worlds. There’s been very little research in this area, but one study from 2006 found that VR can increase dissociative experiences, especially if you’ve been immersed in a virtual world for a long time.

Home – A VR Spacewalk, created by Rewind, lets you take a trip to the International Space Station. Image credit: Home A VR Spacewalk/Rewind

More than a decade on, and with better VR technology and content it’s no surprise that lots of anecdotal evidence points to a similar ‘post virtual reality sadness’, in which the real world doesn’t quite compare.

One potentially problematic side-effect Campbell thinks we do need to consider right away is addiction. But he explains that, unlike with traditional gaming addiction, VR can be designed differently.

“In VR, the user needs to replicate the real-world actions, and that short-circuits the traditional dopamine-hit reward cycle that people often become addicted to,” he told us.

So, for example, if you win a gaming level within VR you’ve likely put a lot of physical effort and exertion in, perhaps by killing the big bad boss at the end of the level. You’re likely to be physically tired. That’s the difference.

Of course you can still get addicted to that feeling – people get addicted to working out – but Campbell tells us: “It’s the responsibility of game designers to make sure that VR games reward a player for real effort and not make a game that’s hard at first to complete, but then actually gets easier.”

As tech and embodiment research advances, the potential of VR could know no bounds. Image credit:

We spoke to Sol Rogers, the Founder and CEO of creative agency and VR production studio Rewind, about some of these concerns.

“We’ve studied how humans interact in and with the real world for hundreds of years, and we’ll probably need the same amount of time to study how humans behave in VR and what the implications are,” he told us.

But he urged people to be excited about the prospect of what VR can do, not scared. “While we can only speculate about the impact VR will have, we need to progress with watchful caution rather than hysteria,” he added.

“Self-regulation from content creators is key, but a governing body also needs to take on some responsibility. Ultimately we need more research, and more time, to fully understand the implications.”

The recipe for greater embodiment

But embodiment is only convincing if everything else in the experience is up to scratch. We asked Rogers about how his team works with tech to create the most realistic experiences.

“Achieving a lifelike user experience in VR is now possible because of tremendous advancements in computer processing power, graphics, video and display technologies,” he told us.

“But the tech needs to stay out of the way; it needs to be entirely inconsequential to the experience, otherwise the spell is broken.”

”The tech needs to be entirely inconsequential to the experience, otherwise the spell is broken.”

Sol Rogers, CEO of Rewind

And he adds that the tech is only half of the equation; his job is to ensure the content is telling the best possible story, every step of the way. “Content is also key to creating presence. While the tech is no doubt important, no user is going to suspend disbelief if the experience is awful.”

From entertainment and social interaction to engineering and performing medical procedures, the more we understand, test and implement embodiment experiments, the more we can engineer experiences to feel real – and in turn be more effective.

With advances in research from the likes of Campbell and his team, along with advances in tech to make headsets slimmer, sensory feedback easier to implement and full-body holograms a reality, the sky is only virtually the limit.

This article is brought to you in association with Vodafone.

Source: Beyond haptics: blurring the line between your virtual avatar and your body | TechRadar
