(L to R): Urs Hölzle, senior vice president for technical infrastructure, Google, and Tom Krazit, Cloud & Enterprise Editor, GeekWire, discuss Google’s cloud business at Structure 2017 (Photo courtesy of Akshay Bhargava)

At Structure 2015, Google’s Urs Hölzle predicted that Google would get as much revenue from cloud computing as it did from its formidable advertising business by 2020. Two years later, it’s pretty clear that deadline is going to fly by, but he believes that goal is still attainable.

“Even though we’re growing much faster than (the ad business), that makes it harder to catch up when they are doing $10 billion a quarter,” said Hölzle, senior vice president for technical infrastructure at Google and one of the key people responsible for building Google’s world-class computing infrastructure a decade ago. “I think I was a little optimistic with 2020, but I don’t think the endpoint has changed.”

Google trails Amazon Web Services and Microsoft in the cloud infrastructure market by a sizable margin, after those companies made cloud computing an integral part of their corporate strategies far more aggressively than Google did as the market developed. No one has ever questioned Google’s technical competence in building cutting-edge computing infrastructure, but it has a reputation for being less business-friendly than its rivals, which is what led Google to bring in VMware co-founder Diane Greene as its cloud business leader two years ago.


There’s still plenty of time for Google to make a dent. “From what I understand, we’re the fastest growing cloud by revenue and by usage,” Hölzle said. “I think Microsoft has a much better marketing effort than we do right now, but I think we’re very competitive with the product.”

Hölzle and I discussed a few of the bets that Google has made in hopes of leapfrogging the competition. One of them is a topic we’ve talked about a lot this year: the Kubernetes container-orchestration product. Originally developed within Google, Kubernetes was released as an open-source project in 2015 and has quickly become a de facto standard for managing large numbers of containers across multiple operating environments.

However, another open-source project closely linked to Kubernetes might actually make more of an impact over time, he said.

“We’re really doubling down with Istio, the service management layer that’s built on top of Kubernetes, which may actually turn out to be more important than Kubernetes itself,” Hölzle said. Istio is a service mesh that helps measure traffic as it moves across Kubernetes clusters, and it’s just starting to evolve after it was first released in May as a joint project between Google, IBM, and Lyft.
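The core idea behind a service mesh is that a proxy sits alongside each service and records per-request telemetry as traffic flows between them. As a rough illustration only (this is not Istio's actual API; all names here are hypothetical), a toy sidecar in Python might look like:

```python
import time

class Sidecar:
    """Toy stand-in for a service-mesh sidecar proxy: intercepts calls
    to a service and records traffic metrics, without the service
    itself knowing anything about the measurement."""

    def __init__(self, service_name):
        self.service_name = service_name
        self.metrics = {"requests": 0, "errors": 0, "total_latency_s": 0.0}

    def call(self, handler, *args, **kwargs):
        """Proxy one request, measuring latency and outcome."""
        start = time.perf_counter()
        self.metrics["requests"] += 1
        try:
            return handler(*args, **kwargs)
        except Exception:
            self.metrics["errors"] += 1
            raise
        finally:
            self.metrics["total_latency_s"] += time.perf_counter() - start

def greet(name):
    # A stand-in application service; it contains no telemetry code.
    return f"hello, {name}"

proxy = Sidecar("greeter")
print(proxy.call(greet, "cloud"))  # prints "hello, cloud"
```

In a real mesh the proxy is a separate process (Envoy, in Istio's case) rather than an in-process wrapper, but the principle is the same: measurement lives outside the application code.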

The way that Kubernetes has become a “standard” without a formal top-down standards body making the decisions is probably the way that cloud computing standards will evolve, he said.

“It’s much easier to find agreement through code than to find agreement by editing a document and having discussions, and ultimately it’s much faster,” Hölzle said.

The other major bet that Google has made is that artificial intelligence services delivered through cloud vendors will attract customers interested in adding world-class AI into their products without having to spend millions building out an AI team. This belief isn’t unique to Google, of course, and it’s changing the way that hardware is provisioned, which has interesting implications for chip companies like Intel and Nvidia.

“The dirty secret behind (AI) is that they require an insane number of computations to just actually train the network,” Hölzle said. Google first started applying artificial intelligence to applications with the release of voice recognition in Android, and if it had tried to process that workload with traditional CPUs, “we would have had to double the entire footprint of Google — data centers and servers — just to do three minutes or two minutes of speech recognition per Android user per day.”

Obviously, that didn’t happen, thanks to the advent of GPUs from companies like Nvidia and homegrown designs like Google’s TPU chip. It remains to be seen how well those systems will scale as AI grows more prevalent, but the AI revolution would have been impossible without those chips, he said.


