Philips’ Azurion augmented-reality platform uses Microsoft’s HoloLens headsets to guide surgeons through an operation. (Philips Illustration)

Virtual reality? Augmented reality? Mixed reality? Today at a Seattle symposium, experts settled on extended reality, or XR, as the catch-all term for devices that put computer-generated visuals in front of your face. And they settled on health care as one of the most promising frontiers for XR.

“I believe health care is going to drive the mass adoption of XR,” Vinay Narayan, vice president of platform strategy and developer community at HTC Vive, said at XR Day, an event presented by the University of Washington’s Department of Human Centered Design and Engineering.

Why does Narayan believe that? He pointed out that health-care applications tend to be enterprise-level deployments in a “high-friction” environment, where employees have to deal with loads of data as they make decisions. Health care is also an industry that touches everyone, amounting to $3.5 trillion in annual spending.

Narayan said that’s an attractive frontier for technologies like XR, which can streamline operations and bring about better outcomes.

As an example, he pointed to the University of California, San Francisco, which is using HTC Vive’s VR system to teach medical students how to operate on virtual-reality patients.

Surgeons could soon be using XR to operate on real-reality patients as well. For example, Microsoft’s HoloLens team has been working with Philips on an image-guided therapy platform called Azurion. The mixed-reality system matches up a patient’s flesh-and-blood anatomy with a computer-generated model that uses X-ray and ultrasound imaging to guide surgeons through medical procedures.

The Food and Drug Administration has cleared HoloLens-based medical systems for use in pre-operative surgical planning, and HoloLens also figures in a Cleveland clinical trial aimed at guiding surgeons through procedures to treat tumors.

On Thursday, the FDA is planning a public workshop in Washington, D.C., to discuss best practices for virtual and augmented reality in medicine.

“I think this is going to be really interesting,” said Bernard Kress, partner optical architect on Microsoft’s HoloLens team.

During his time at Google X Labs, Kress played a leading role in the development of Google Glass, the camera-equipped smart glasses that sparked a sensation (and scorn) when they came out in 2014. The scorn eventually forced Google to phase out its mass-market Glass prototype, but the glasses live on as an enterprise product.

Kress said the optics of smart glasses and headsets have come a long way since then. Among the innovations on tap are light-field smart glasses that let users shift their focus from far to near as they take in an XR scene. “That’s very important for surgical applications,” Kress said.

Simran Bhatia, an undergraduate research assistant at the University of Washington, sets up a virtual robot-building demonstration during XR Day on the UW campus. (GeekWire Photo / Alan Boyle)

Reducing the bulk of XR headsets, and designing the lenses so that other people can see the eyes of the wearer, will be particularly important for applications in the operating room. “If you have an ‘Oh My God’ moment, it’s really difficult to share this data with nurses. … If you can’t see the surgeon’s eyes, well, something’s really missing there,” Kress said.

The design of XR environments will also have to be fine-tuned to suit the operating room rather than the gaming arcade. “Designers should start to focus on making systems that are calm, smartly fading into the background,” said Evie Powell, virtual reality engineer at Seattle-based Proprio.

Proprio is working on an XR platform that can guide a surgeon through an operative procedure, and let medical interns and other observers watch over the surgeon’s shoulder. “Surgeons are able to quickly and precisely integrate their pre-operative plan, and perform the procedure entirely in XR,” Powell said.

Perhaps the most important technology coming to the XR marketplace is eye-tracking, which opens the way to true hands-free interaction. “You will see it in every single headset,” Kress said.

Narayan agreed that eye-tracking is important, and not just for surgery. For example, it’s hard to imagine being able to work the controllers while you’re swinging at virtual balls in MLB Home Run Derby VR. That’s why his company developed Vive Pro Eye, which adds gaze-oriented menu navigation and hands-free control to the game.

“With eye-tracking, you stay in position, and actually cycle through the menus and take action with your eyes,” Narayan said.
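The interaction Narayan describes is commonly built on a “dwell” pattern: if the user’s gaze rests on a menu item for long enough, the system treats that as a click. Here’s a minimal sketch of that idea in Python, using a hypothetical stream of timestamped gaze samples; this is not HTC’s Vive Pro Eye API, just an illustration of the dwell-selection logic.

```python
DWELL_TIME = 0.8  # seconds of steady gaze required to count as a "click"

def dwell_select(gaze_samples, dwell_time=DWELL_TIME):
    """Return the menu item selected by sustained gaze, or None.

    gaze_samples: time-ordered list of (timestamp, item) pairs, where
    item is the menu entry the gaze currently falls on (or None).
    """
    current, start = None, None
    for t, item in gaze_samples:
        if item != current:
            # Gaze moved to a different target: restart the dwell timer.
            current, start = item, t
        elif item is not None and t - start >= dwell_time:
            # Gaze has rested on the same item long enough: select it.
            return item
    return None

# A steady 0.9-second gaze on "Pitch" triggers a selection:
samples = [(0.0, "Pitch"), (0.3, "Pitch"), (0.9, "Pitch")]
print(dwell_select(samples))  # → Pitch
```

Real eye-tracking SDKs add smoothing and visual feedback (such as a shrinking ring around the gazed-at item) so users don’t trigger selections accidentally, but the core timer-reset loop looks much like this.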

He said eye-tracking will usher in whole new types of applications, such as biometric scanning, identity verification and banking in XR.

The advent of 5G mobile communication networks is also likely to enhance wireless XR experiences. “A true AR headset … has all these different technologies that are built on top of that. There’s a challenge in building a good headset,” Narayan said. “What 5G allows you to do is decouple some of the complexities into the network.”

XR devices could look a lot different once the next big wave of technology arrives, and perhaps the biggest challenge facing companies in the XR space is building the right surfboard for riding that wave. “What we’re going to see three years from now is being worked on today,” Narayan said.
