Google caused a big stir last week with the unveiling of Project Glass, its augmented reality initiative, including a video showing its vision of the future and a concept hardware design of the glasses themselves.

A prototype of Google's glasses

One of the people watching closely was Randall Sprague, the co-founder of Seattle-area startup Innovega and the former chief engineer of Bothell-based Microvision.

Innovega, which showed its technology at the Consumer Electronics Show this year, is developing contact lenses that use optical filtering nanotechnology to let people see images projected onto glasses at close range by pico projectors. The company, supported by the National Science Foundation and others, sees the technology as an alternative to bulkier headgear commonly used for augmented reality.

So what does Sprague think of the Google announcement?

“For me, it’s all really positive, because you’ve got somebody with the attention-grabbing ability of Google who’s out promoting the same thing that Innovega is trying to develop,” he said via phone today. “Innovega is a small company, so we’re focusing on the hardware which enables all of this, and we’re going to need people like Google to develop the applications and have the hype and PR ability to promote this. So it’s good for Innovega to have somebody like Google promote what we’re doing.”

Randall Sprague, left, and Stephen Willey, founders of Innovega, at the Consumer Electronics Show earlier this year.

OK, but what did he really think of the video and prototype Google showed? Continue reading for his thoughts …

Google’s augmented reality video: Google did two things last week. They released a video that shows their vision of what augmented reality can be and will be some day. This is similar to a video that Nokia put out; BMW has a really nice video showing augmented reality helping to repair a car. Actually, one of the better ones I’ve seen is done by a pseudo-company called Stark Industries, which is actually a fictional Marvel Comics company. I’ve actually had people who thought that Stark HUD was real. No, no, that’s Marvel Comics.

And people think that this Google video is real. Google did say that it’s just a concept of what AR will be some day. But they didn’t go out of their way to say, this is a fictional, theatrical film that you’re about to see.

Google co-founder Sergey Brin wearing the prototype glasses last week. (Photo by Thomas Hawk, via Flickr)

Thoughts on Google’s hardware design: The other thing that Google did is they showed their progress on their Project Glass, which is pretty much an old-school eyewear device. A really tiny display. They’ve mounted it up and to the right, which is kinda odd because that’s the worst possible place to put a display if you actually want to be able to see it. But it has the advantage that you can make a nice-looking display product. If you’re not constrained by function, you can actually make it look pretty nice.

So this is an old-school, tiny-optic, tiny-display device. And it has no correlation to the video that they showed. The video shows full-field wide imagery, right in front of your eyes. People think that the video is imagery that this eyewear would support, and they’re not at all related.

Google could easily move that optic in front of the eye so that it was looking straight out. Now, it’s still going to be a tiny display, so you’re still going to have tunnel vision of where the digital information is. They could do that, but I think the reason they didn’t is that then it wouldn’t look so cool. People have been doing eyewear for 40 years, and everybody knows that’s not where you put your display. Either they were naive about it or, more likely, the whole point was not to produce a functional display but to make a display that looks good in pictures.

Innovega's technology uses contact lenses with optical filtering nanotechnology to let people see images projected onto glasses, overlaying graphics on the real world.

Nobody would have paid any attention to the Google glasses except that they also coupled them with this video. A lot of people think you get this big display from this small little panel, but photons can only enter your eye from the direction from which they originate. If it’s just a small panel off to the side, that’s the only area of your vision that photons can come from. So it’s definitely a tiny window that they’ve got.
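To put rough numbers on that point, here is a minimal sketch (the optic size and eye distance are hypothetical figures chosen for illustration, not specifications from Google or Innovega) estimating the angle a small near-eye optic subtends at the eye, compared with the roughly 180-degree span of normal human vision:

```python
import math

def angular_subtense_deg(optic_width_mm: float, eye_distance_mm: float) -> float:
    """Angle (in degrees) subtended at the eye by a flat optic of the given
    width sitting at the given distance, using simple pinhole geometry."""
    return math.degrees(2 * math.atan(optic_width_mm / (2 * eye_distance_mm)))

# Hypothetical numbers for illustration only: a ~10 mm optic about 25 mm
# from the eye, like the small block visible in the Project Glass photos.
print(f"Small off-axis optic: ~{angular_subtense_deg(10, 25):.0f} degrees")  # ~23 degrees
print("Normal human horizontal vision: ~180 degrees")
```

Under those assumed numbers, the optic covers only a couple dozen degrees of the visual field, which is the “tiny window” Sprague describes, in contrast to the full-field imagery shown in the concept video.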

Another technical detail: If you look at the picture, that glass looks fairly thick. They don’t tell you how they’re doing it, but there is an architecture where a thick piece of glass like that is used as a beam splitter. I don’t know if they’re doing that or not, but with a thick glass like that, you don’t really want to be looking through it at an off-axis angle, because it creates distortions.
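One simple way to see the off-axis issue (a generic optics sketch, not a description of Google’s actual design; the thickness, refractive index, and angles are hypothetical) is to compute how far a ray is displaced sideways when it crosses a thick plane-parallel plate at an angle:

```python
import math

def lateral_shift_mm(thickness_mm: float, n: float, incidence_deg: float) -> float:
    """Sideways displacement of a ray passing through a plane-parallel
    glass plate of the given thickness and refractive index, entering
    at the given angle of incidence."""
    theta_i = math.radians(incidence_deg)
    theta_r = math.asin(math.sin(theta_i) / n)  # Snell's law
    return thickness_mm * math.sin(theta_i - theta_r) / math.cos(theta_r)

# Hypothetical numbers for illustration: 5 mm of glass with index 1.5.
for angle in (0, 15, 30, 45):
    print(f"{angle:2d} deg incidence -> {lateral_shift_mm(5, 1.5, angle):.2f} mm shift")
```

Because the shift grows nonlinearly with viewing angle (and varies slightly with wavelength), looking through a thick plate off-axis doesn’t just offset the image uniformly; it warps it, which is consistent with the distortion Sprague describes.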

I look at that and I’m suspicious that this is all really meant as a cosmetic model more than a functional model. Generally the near-eye display industry would avoid all of those things as not a good design choice. So I think the primary function was show.

Bottom line: I’m really happy that Google is promoting AR, because it is the right way to interact with digital information. You see people walking down the street looking at the phones in their hands, and that’s just not a very convenient, practical way. So I see this as a first step for a lot of people to start accepting the whole idea. So I’m really excited about it. It’s really going to be a major thing going forward.

Previously on GeekWire: Would you wear these Google glasses in public?
