Earlier this month, in advance of the July 4 holiday, Microsoft and Nokia released a collaborative experience for Xbox Live called Nokia Music Mix Party — letting different people in a room use their smartphones to control which track is playing at any given moment via the television screen. It was a bit of a novelty — complete with its own goofy video — but there was some serious technology making it happen behind the scenes, and Microsoft’s IE team is betting it’s a sign of more to come.
Microsoft this morning outlined its vision for what it calls the “companion web,” a plan to link Internet Explorer on tablets, TV screens and other large screens with smartphones. The company points out that many of us — more than 80 percent of Americans, by one estimate — are already using our smartphones and other devices while we’re watching TV anyway.
The idea is to link devices to the larger screen in a “cohesive, unified experience,” said Bryan Saftler, an Internet Explorer senior product manager, describing the vision in an interview.
“The one thing that is ubiquitous across all of this is the web,” he said. “We’re starting to see this bridge across devices.”
The company is aiming to use this “open web” approach to further differentiate itself in the living room from Apple, Google and other rivals that have taken the lead in key consumer markets. An update for the Xbox 360 last year added a version of IE to Xbox Live.
Microsoft has already been headed in this general direction with its Xbox SmartGlass apps, connecting Windows Phone, iPhone and Android to Xbox Live to control the console and interact with content related to what’s playing on the larger screen. The IE team’s plan is to make it easy for web developers to create these companion experiences for their own sites, leveraging web technologies.
As the latest example, Microsoft this morning is pointing to a new online experience from Polar that lets people connect their smartphones to their browser running on a tablet or TV screen, and then use the device in their hand to vote on the screen. The connection between phone and the larger screen is created over the Internet after the user snaps a QR code.
Other examples include Daily Burn, a fitness site that lets people use their smartphones to control a workout experience on the television.
So how does this actually work? Here’s the explanation via email from Erik Klimczak of Clarity Consulting, the lead developer of the Nokia Music Mix Party experience.
One of the key aspects of companion sites is the “handshake” between the device and the primary screen. The goal is to be as seamless as possible. At first glance it even seems a bit magical. First, a user hits a website on a big screen (in our case Nokia Music on the Xbox), then snaps a QR code with any mobile device. Now he can control the big screen with the small screen. In other words, two seemingly unconnected, unrelated devices can talk to each other via simple image capture.
The handshake works mostly through WebSockets / long polling and Node. For Nokia Music we used a messaging platform called Faye that works with both WebSockets and long polling. The long polling approach is used for browsers that don’t support WebSockets, like IE for Xbox. When the user snaps a QR code, it contains a URL and ID specific to the big screen. When the Node server detects the client, it creates a “session” between the big screen and the small screen. Then we use the messaging platform to send information back and forth between the devices in near real-time.
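The long-polling fallback Klimczak mentions can be sketched as a per-session message queue: a device opens a request that the server holds open until its counterpart publishes something. This is a toy in-memory model of the pattern a messaging library like Faye handles for you, not Faye’s actual internals or API:

```javascript
// Toy model of long polling between two paired devices. A "poll" resolves
// as soon as a message is available; a "publish" delivers to any waiting
// poll, or queues the message until the next one arrives.
class Session {
  constructor() {
    this.queue = [];    // messages waiting to be delivered
    this.waiters = [];  // open long-poll requests waiting for messages
  }

  // Called when one device sends a message to its counterpart.
  publish(message) {
    const waiter = this.waiters.shift();
    if (waiter) {
      waiter([message]);          // deliver immediately to an open poll
    } else {
      this.queue.push(message);   // hold until the next poll arrives
    }
  }

  // Called when a device opens a long-poll request.
  poll() {
    if (this.queue.length > 0) {
      return Promise.resolve(this.queue.splice(0));
    }
    return new Promise(resolve => this.waiters.push(resolve));
  }
}

// Usage: the TV polls, then the phone publishes a "next track" command.
const session = new Session();
session.poll().then(msgs => console.log('big screen got:', msgs));
session.publish({ cmd: 'nextTrack' });
```

In a real deployment each request would travel over HTTP, and a library like Faye would upgrade the connection to a WebSocket where the browser supports it, falling back to this polling pattern (as on IE for Xbox) where it doesn’t.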
The next step, says IE’s Saftler, is to get feedback from a broad set of web developers, and start to get the technologies into their hands to build their own experiences that link small and big screens. The company hopes this will become a standard part of web development, just as developers have embraced responsive design to adapt sites to screen sizes on the fly.