Customers of Amazon Web Services looking for easier ways to get their deep-learning models into production will be able to take advantage of a new service based around Docker containers.
Matt Wood, general manager of deep learning and AI for AWS, announced the AWS Deep Learning Containers Wednesday during the AWS Summit in Santa Clara, Calif. The idea is to take some of the complexity out of deploying complicated deep-learning models onto cloud services by using familiar tools like the widely used Docker container format, Wood said.
Customers who choose this option will be able to “do less of the undifferentiated heavy lifting of installing these complicated frameworks,” Wood said. AWS Deep Learning Containers supports the TensorFlow and MXNet frameworks at launch, and support for PyTorch will be added soon, he said.
Anyone following cloud computing over the last few years has heard a lot about artificial intelligence and machine learning, and deep learning is a more sophisticated and complicated type of machine learning that requires a lot of computing power. However, the people who are experts in this field generally aren’t experts in the also-complicated process of making workloads run reliably at scale on computing resources. That is where AWS Deep Learning Containers comes in, letting them fall back on a de facto industry standard to get the job done.
AWS Deep Learning Containers can run on Amazon ECS (a managed Docker container service) or EKS (a managed Kubernetes service), and the container images themselves are free.
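On EKS, deploying one of these images looks like any other Kubernetes workload. The sketch below is illustrative only: the image URI, script name, and resource limits are placeholders, not values from the announcement — actual Deep Learning Containers images are pulled from AWS's Elastic Container Registry.

```yaml
# Hypothetical pod spec running a prebuilt deep-learning container on EKS.
# The image URI is a placeholder; real URIs are framework-specific ECR paths.
apiVersion: v1
kind: Pod
metadata:
  name: tensorflow-training
spec:
  restartPolicy: Never
  containers:
    - name: trainer
      image: <account>.dkr.ecr.<region>.amazonaws.com/tensorflow-training:<tag>
      command: ["python", "train.py"]   # hypothetical training script
      resources:
        limits:
          cpu: "4"
          memory: 8Gi
```

Because the framework and its dependencies are baked into the image, the spec stays this simple — there is no installation step for TensorFlow or MXNet in the workload definition itself.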
Around 6,500 people attended the AWS Summit in Santa Clara, one of several local events the cloud market share leader puts on around the planet each year. AWS tends to save its big announcements for the annual re:Invent conference in Las Vegas, and it announced Wednesday that several services first unveiled at that event last November — including AWS App Mesh and F5’s Cloud Services — are now generally available.