Google CEO Sundar Pichai speaks at Google I/O 2017. (Screenshot)

Google is “rethinking our computational architecture again,” according to CEO Sundar Pichai, who rolled out the next generation of Google’s specialized chips for machine-learning research Wednesday.

The new chips are called Cloud TPUs (tensor processing units), and they are available to Google Cloud Platform customers right away, Pichai said on stage at the Shoreline Amphitheater to kick off Google I/O. Google has built arrays of these TPUs called “TPU pods,” which contain 64 TPUs and are capable of delivering 11.5 petaflops of processing power.
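
That pod-level number squares with the per-device figure Google quoted in its broader Cloud TPU announcement: roughly 180 teraflops per Cloud TPU device. That per-device number comes from Google's announcement materials rather than this keynote, so treat the quick back-of-the-envelope check below as a rough sanity pass, not an official spec:

```python
# Rough sanity check of the TPU pod figure.
# Assumes ~180 teraflops per Cloud TPU device, the per-device number
# from Google's Cloud TPU announcement (not stated in this article).
TFLOPS_PER_TPU = 180
TPUS_PER_POD = 64

pod_petaflops = TFLOPS_PER_TPU * TPUS_PER_POD / 1000
print(f"~{pod_petaflops:.1f} petaflops per pod")  # ~11.5 petaflops
```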

A Google Cloud TPU pod. (Courtesy: Google)

“One of our new large-scale translation models used to take a full day to train on 32 of the best commercially-available GPUs—now it trains to the same accuracy in an afternoon using just one eighth of a TPU pod,” wrote two Google legends — AI research fellow Jeff Dean and infrastructure czar Urs Hölzle — in a blog post announcing the new chips.

The “commercially-available GPUs” phrase is a mild shot at Nvidia, which has grabbed the early lead in the small but important market for hardware that powers advanced machine-learning research. But Google has been designing its own chips for artificial intelligence and machine learning applications for several years, and while it still plans to offer Google Cloud Platform customers the option of running workloads on Nvidia’s chips (according to The Next Platform), it’s pretty clear which chip design it prefers.

A Google Cloud TPU board. (Courtesy: Google)

Google’s “commercially available” claim also carefully sidesteps Nvidia’s new Volta architecture, which was announced last week and will be available later this year. Nvidia’s chips have been used by all the major cloud providers — including Amazon Web Services and Microsoft Azure — to allow customers to conduct machine-learning research without having to set up their own servers.

Pichai called the new chips “an important advance in our technical infrastructure for the cloud era” while revealing them on stage. And it’s a cultural shift as well: Google has always been at the cutting edge of technical infrastructure design, but it has tended to keep those advances under wraps for its own use. Google Cloud Platform customers will be able to kick the tires on these chips through an alpha program launched Wednesday.
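
For a rough sense of what “kicking the tires” looks like from the developer’s side, here is a minimal TensorFlow sketch of pointing a Keras training job at a Cloud TPU. It uses today’s tf.distribute.TPUStrategy API rather than whatever interface the 2017 alpha program exposed, and the TPU name “my-cloud-tpu” is a placeholder for a device you would provision in your own project:

```python
import tensorflow as tf

# "my-cloud-tpu" is a placeholder; in practice this is the name (or gRPC
# address) of a Cloud TPU provisioned in your own Google Cloud project.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-cloud-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across the TPU's cores and handles
# the XLA compilation step for you.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Any Keras model built inside this scope runs its training steps on the TPU.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# model.fit(train_dataset, epochs=5) would then train on the TPU,
# with train_dataset supplied as a tf.data.Dataset.
```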
