An Ikea warehouse in Brooklyn. (Wikimedia Photo / public domain)

Google’s history of machine-learning research increasingly informs its cloud product strategy, and the company introduced a preview of machine-learning capabilities for its BigQuery data warehouse Wednesday at Google Cloud Next 2018.

BigQuery customers will be able to train machine-learning models inside the data warehouse through BigQuery ML, which Google executives plan to demonstrate on stage at the Moscone Center in San Francisco on the second day of Google’s big cloud event. The idea is to let users apply machine learning to big data sets without shuffling that data back and forth between BigQuery and a separate system used to train the models.
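To make the workflow concrete, here is a minimal sketch of what in-warehouse training could look like from the google-cloud-bigquery Python client. The dataset, table, and column names are hypothetical placeholders for illustration, not anything Google announced; the point is that both training and prediction are expressed as SQL that runs where the data already lives.

```python
# Hedged sketch: training and querying a BigQuery ML model from Python.
# Dataset/table/column names (my_dataset.site_visits, etc.) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# BigQuery ML models are created with a SQL statement; training runs as a
# normal BigQuery job, so the underlying rows never leave the warehouse.
create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.purchase_model`
OPTIONS(model_type='logistic_reg', input_label_cols=['purchased']) AS
SELECT
  country,
  pageviews,
  purchased
FROM `my_dataset.site_visits`
"""
client.query(create_model_sql).result()

# Predictions are also expressed in SQL, via ML.PREDICT over the trained model.
predict_sql = """
SELECT predicted_purchased, country
FROM ML.PREDICT(MODEL `my_dataset.purchase_model`,
                (SELECT country, pageviews FROM `my_dataset.new_visits`))
"""
for row in client.query(predict_sql).result():
    print(row.predicted_purchased, row.country)
```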

Data warehouses are specialized databases optimized for analytical queries, prioritizing fast reads over frequent writes. All major cloud vendors offer a service of this type, and Snowflake Computing, led by ex-Microsoft executive Bob Muglia, has raised a ton of money to build out its own cloud data warehouse.

On the data front, Google also plans to announce that BigQuery users will soon be able to cluster related data within tables to improve query performance, and to add geospatial data to their data warehouses with help from the Google Maps team.
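For readers curious what those two additions look like in practice, the following is a rough sketch using the same Python client. Again, the table and column names are invented for illustration: clustering co-locates rows that share values in the clustering columns so filtered queries scan less data, and geospatial support adds a GEOGRAPHY type with ST_* functions.

```python
# Hedged sketch: a clustered table and a geospatial query in BigQuery.
# Table/column names (my_dataset.events, longitude, latitude) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Clustering (paired here with date partitioning) groups rows by customer_id,
# so queries filtering on that column read fewer blocks.
clustered_table_sql = """
CREATE TABLE `my_dataset.events_clustered`
PARTITION BY DATE(event_time)
CLUSTER BY customer_id
AS SELECT * FROM `my_dataset.events`
"""
client.query(clustered_table_sql).result()

# Geospatial functions let SQL answer location questions directly, e.g.
# which events happened within 1 km of a point of interest.
geo_sql = """
SELECT event_id
FROM `my_dataset.events_clustered`
WHERE ST_DWITHIN(ST_GEOGPOINT(longitude, latitude),
                 ST_GEOGPOINT(-122.4194, 37.7749),
                 1000)
"""
for row in client.query(geo_sql).result():
    print(row.event_id)
```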
