ESPN technicians and NFL game staff chat about new camera equipment embedded in the first down marker and sideline pylons before a Monday Night Football game Dec. 2 in Seattle. (GeekWire Photo / Taylor Soper)

Technology is changing the way we watch live sports. The innovation dates back decades, from the advent of color TV, to the yellow augmented reality first down line, to high-def TV, to internet-powered streaming options.

With software and hardware evolving at a rapid pace, what’s next for the industry? How will technologies such as artificial intelligence and machine learning affect live sports production? And when will the robots take over — if ever?

We posed some of those questions to ESPN executives who help put together Monday Night Football (MNF) each week during the NFL season. Earlier this month the crew was in Seattle, producing and airing the Seahawks’ thrilling 37-30 win over Minnesota on Dec. 2.

Jim Munn, senior operations specialist for ESPN, stands outside a Monday Night Football truck at CenturyLink Field in Seattle before the Vikings vs. Seahawks game on Dec. 2. (GeekWire Photo / Taylor Soper)

The amount of work and complexity that goes into a professional 3-hour game production is fascinating and easy to overlook for fans enjoying the polished broadcast from the couch. ESPN is constantly looking at ways to enhance the viewing experience, whether it’s this year’s MNF graphics hailed as “absurd” and “beautiful” by The Ringer, or devices that provide new perspectives (some of which aren’t always initially welcomed).

New tools rolled out this season for MNF include the Line-To-Gain cameras and the Marker Cam. These are remote-controlled cameras embedded in a ground-level pylon at the orange and black first down marker and in the bullseye atop the marker itself. The wide-angle cameras provide an additional perspective for both fans and referees, showing, for example, whether a player was out of bounds or whether a pass interference penalty was committed.

“Technology has allowed us to bring the viewer closer to the field, which is the thing that people at home crave more than anything,” said Jimmy Platt, who took over this year as Monday Night Football director.

Jim Munn, a senior operations specialist with ESPN, said the new cameras provide multiple benefits. “They are really beneficial for production and the game itself,” he said.

Embedded cameras have another advantage: they don’t clutter the sidelines with additional camera operators.

That raises the question: will humans eventually be removed from the live game production process as automation and camera technology improve?

“We are not losing any workers because they shift from running a regular camera to manning the joystick on the virtual camera,” Munn noted.

For a high-budget production like Monday Night Football, which averaged 12.6 million viewers this season, there’s too much nuance that requires a human touch — at least for now, Platt said.

But ESPN and other broadcast companies can’t send multiple trucks and a staff of workers to every sporting event around the world. That’s where automated production can help.

The advent of live streaming is opening up a new world of possibilities for live sports consumption. Paired with computer vision, machine learning, robotics, and other technologies, a future where more games are filmed with little or no human interaction seems plausible.

A 2014 talk at the Sloan Sports Analytics Conference detailed how robotic pan-tilt-zoom cameras can run automatically using player tracking technology.
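The basic idea is simple enough to sketch. The snippet below is an illustrative Python toy, not the system described in the talk or anything ESPN or Pixellot uses: it points a pan-tilt camera at the centroid of tracked player positions and smooths the motion so the shot doesn’t jerk on every tracking update. All coordinates, names, and parameters are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class CameraPose:
    pan_deg: float   # rotation around the vertical axis
    tilt_deg: float  # rotation up/down

def target_pose(players, cam_xyz):
    """Pan/tilt angles that point the camera at the players' centroid."""
    cx = sum(p[0] for p in players) / len(players)
    cy = sum(p[1] for p in players) / len(players)
    dx, dy = cx - cam_xyz[0], cy - cam_xyz[1]
    dist = math.hypot(dx, dy)
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(-cam_xyz[2], dist))  # camera mounted above the field
    return CameraPose(pan, tilt)

def smooth(current, target, alpha=0.15):
    """Blend toward the target pose; smaller alpha means slower, steadier moves."""
    return CameraPose(
        current.pan_deg + alpha * (target.pan_deg - current.pan_deg),
        current.tilt_deg + alpha * (target.tilt_deg - current.tilt_deg),
    )

# Example: three players at field coordinates (meters), camera mounted 12 m up at midfield.
players = [(30.0, 20.0), (34.0, 22.0), (28.0, 25.0)]
pose = CameraPose(0.0, 0.0)
pose = smooth(pose, target_pose(players, cam_xyz=(0.0, 0.0, 12.0)))
print(round(pose.pan_deg, 1), round(pose.tilt_deg, 1))
```

Production systems layer far more on top, such as shot composition rules and predictive tracking, but the loop of track, aim, and smooth is the core of what makes an operator-free camera possible.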

ESPN is working with an Israel-based company called Pixellot to broadcast college sports and NBA Summer League action, as Sports Video Group reported earlier this year. Pixellot uses automated cameras and has tech that can pull stats from a scoreboard and integrate them into the live stream.
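For a sense of what “pulling stats from a scoreboard” can involve, here is a minimal sketch of the general approach, assuming a fixed camera view of the scoreboard. It is not Pixellot’s implementation; the region coordinates, OCR settings, and helper names are hypothetical, and it leans on OpenCV and the pytesseract OCR wrapper as stand-ins for whatever a production pipeline actually uses.

```python
import cv2
import pytesseract  # OCR wrapper; an assumption, not Pixellot's actual stack

# Assumed fixed scoreboard region in the frame: x, y, width, height (pixels).
SCOREBOARD_ROI = (50, 40, 400, 120)

def read_scoreboard(frame):
    """Return the raw text the OCR engine sees inside the scoreboard region."""
    x, y, w, h = SCOREBOARD_ROI
    crop = frame[y:y + h, x:x + w]
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    gray = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1]
    return pytesseract.image_to_string(gray, config="--psm 7").strip()

def overlay_score(frame, text):
    """Burn a basic score bug into the corner of the frame before it is streamed."""
    cv2.rectangle(frame, (20, 20), (320, 70), (0, 0, 0), thickness=-1)
    cv2.putText(frame, text, (30, 55), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 255), 2)
    return frame

# Per-frame loop would look roughly like:
#   score_text = read_scoreboard(frame)
#   frame = overlay_score(frame, score_text)
```

The appeal for low-budget events is that this kind of pipeline runs without a graphics operator: the score shown to viewers comes straight from what the camera can read in the venue.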

“The combination of computer vision, AI, and deep learning provides, for the first time, a viewing experience that is comparable to a human camera operator and, in some respects, supersedes the abilities of a single camera operator,” Pixellot co-founder Gal Oz wrote last year. “These algorithms will get better and will be able to cover more types of sporting event.”

This video shows clips from games filmed by Pixellot’s robotic cameras.

Munn said that automated cameras with artificial intelligence are reliable at recognizing patterns, like knowing when the ball is on the field, but they can’t predict the unknown. “Training that is very difficult,” he said.

But he added: “We have a lot of content where AI would be perfect. It gets us coverage where we’ve never had it before, and that serves all of our fans.”

Platt said he doesn’t expect robots to replace the work that producers and directors do for big-time games with millions of viewers. But he said anything is possible, given how much live sports broadcasts have evolved in recent years.

“I don’t doubt any of it,” he said. “But I think there’s a feel component to what we do that doesn’t exist when a computer does it.”
