The study explores the possibility of building a display system that looks more like sunglasses than the bulky, goggle-like headsets currently favored for virtual and mixed reality. The system could also incorporate a vision-correcting algorithm.
“If we ultimately wish to make a display the size of eyeglasses, we must build the functionality of eyeglasses into the display,” the research team writes in a blog post about the technology.
The key to the system is a type of near-eye holographic display that processes visual information from a real-world environment, adds layers of computer-generated information as desired, and then sends the resulting image into the eye.
The processing software would have to be powerful enough to adjust the signal, pixel by pixel, to accommodate the optics of the near-eye display as well as the optical corrections needed by the wearer.
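One established way to fold a wearer's prescription into a phase-only hologram is to add a quadratic thin-lens phase term at every pixel, so the display itself performs the defocus correction. Here is a minimal sketch in Python/NumPy; the pixel pitch, wavelength, and prescription values are illustrative assumptions, not figures from the research:

```python
import numpy as np

def lens_phase(height, width, pixel_pitch, wavelength, focal_length):
    """Quadratic phase profile of a thin lens, sampled per pixel.

    Choosing focal_length from the wearer's prescription
    (f = 1 / diopters) lets the hologram apply the defocus
    correction that eyeglasses would otherwise provide.
    """
    ys = (np.arange(height) - height / 2) * pixel_pitch
    xs = (np.arange(width) - width / 2) * pixel_pitch
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    r_sq = xx**2 + yy**2
    # Thin-lens phase: phi(r) = -pi * r^2 / (lambda * f)
    return -np.pi * r_sq / (wavelength * focal_length)

# Illustrative numbers: 8 um pixels, 520 nm green light, -2 D prescription.
correction = lens_phase(1080, 1920, 8e-6, 520e-9, 1.0 / -2.0)

# Folding the correction into a computed hologram is a per-pixel add,
# wrapped back into the [0, 2*pi) range a phase modulator expects.
hologram_phase = np.zeros((1080, 1920))  # stand-in for the computed hologram
corrected = np.mod(hologram_phase + correction, 2 * np.pi)
```

The point of the sketch is the shape of the computation: the correction is just another per-pixel phase map, which is why the researchers can describe it as customizing a "complex lens" for each point in the image.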
“This is akin to using an independently customized, complex lens to form each point in the image,” the researchers say. To reduce the processing overhead, the glasses could track eye movements and sharpen the pixels specifically in the area where the wearer is looking.
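A foveated scheme like the one described can be as simple as a per-pixel importance weight that falls off with distance from the tracked gaze point; the hologram solver then spends its effort where the weight is high. A hypothetical sketch, where the Gaussian falloff and the specific gaze point and radius are my own illustrative choices:

```python
import numpy as np

def foveation_weights(height, width, gaze_yx, fovea_radius):
    """Per-pixel importance: 1.0 at the gaze point, decaying outward.

    A solver can scale its per-pixel error by these weights so that
    peripheral pixels, which the eye resolves poorly, cost less to
    get slightly wrong -- reducing processing overhead.
    """
    ys = np.arange(height)[:, None]
    xs = np.arange(width)[None, :]
    dist = np.hypot(ys - gaze_yx[0], xs - gaze_yx[1])
    # Gaussian falloff; pixels beyond ~2 radii get near-zero weight.
    return np.exp(-0.5 * (dist / fovea_radius) ** 2)

# Gaze at the center of a 640x480 frame, with a 60-pixel fovea.
w = foveation_weights(480, 640, gaze_yx=(240, 320), fovea_radius=60.0)
```

In practice the weight map would be recomputed each frame from the eye tracker's latest gaze estimate.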
“Combined with GPU-accelerated algorithms, we demonstrate real-time hologram generation at rates of 90-260 Hz on a desktop GPU,” they report. Those rates meet or exceed the 90 Hz refresh commonly targeted by today’s VR headsets, and sit well above the roughly 60 Hz point at which most viewers stop noticing flicker.
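The reported rates translate directly into per-frame compute budgets, which is the number that matters when fitting hologram generation into a real-time pipeline:

```python
# Time budget per frame at the reported generation rates.
budgets_ms = {hz: 1000 / hz for hz in (90, 260)}
for hz, ms in budgets_ms.items():
    print(f"{hz} Hz -> {ms:.1f} ms per hologram")
```

At 90 Hz the system has about 11 ms to compute each hologram; at 260 Hz, under 4 ms.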
Microsoft emphasizes that the project relates to basic research into holographic displays, and is “not necessarily indicative of any Microsoft product roadmap.” But if the company ever starts including X-ray vision as a HoloLens feature, you’ll know the reason why.
Hat tip to Popular Mechanics.