Google Cloud AI Platform updates make it ‘faster and more flexible’

Source: artificialintelligence-news.com

Google has issued several updates to its Cloud AI Platform which aim to make it ‘faster and more flexible’ for running machine learning workloads.

Cloud AI Platform is Google’s machine learning platform-as-a-service (ML PaaS) designed for AI developers, engineers, and data scientists. The platform is end-to-end, supporting the full development cycle from preparing data, through training, to building and deploying machine learning models.

Among the most noteworthy additions to the platform is support for Nvidia GPUs. As Google explains, “ML models are so complex that they only run with acceptable latency on machines with many CPUs, or with accelerators like NVIDIA GPUs. This is especially true of models processing unstructured data like images, video, or text.”

Previously, Cloud AI Platform supported only one vCPU and 2GB of RAM. You can now add GPUs, such as the inference-optimised, low-latency NVIDIA T4, to AI Platform Prediction. The basic tier also adds support for up to four vCPUs.
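
To illustrate, here is a minimal sketch of deploying a model version with an attached T4 through the AI Platform (ml.googleapis.com) REST API via the google-api-python-client library. The project, model, bucket, runtime version, and machine type below are placeholder assumptions rather than values from the article, and the available machine type and accelerator combinations should be checked against Google’s current documentation:

    # Deploy a model version on AI Platform Prediction with an NVIDIA T4 GPU attached.
    # Requires the google-api-python-client package and application default credentials.
    from googleapiclient import discovery

    service = discovery.build("ml", "v1")

    parent = "projects/my-project/models/my_model"   # hypothetical project and model
    version_body = {
        "name": "v1_gpu",
        "deploymentUri": "gs://my-bucket/export/",   # location of the exported model
        "runtimeVersion": "1.15",                    # assumed runtime version
        "framework": "TENSORFLOW",
        "machineType": "n1-standard-4",              # Compute Engine machine type
        "acceleratorConfig": {                       # attach one inference-optimised T4
            "count": 1,
            "type": "NVIDIA_TESLA_T4",
        },
    }

    # versions.create returns a long-running operation describing the deployment.
    operation = service.projects().models().versions().create(
        parent=parent, body=version_body
    ).execute()
    print(operation)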

AI Platform Prediction is being used by Conservation International, a Washington-based organisation with the mission “to responsibly and sustainably care for nature, our global biodiversity, for the wellbeing of humanity,” for a collaborative project called Wildlife Insights.

“Wildlife Insights will turn millions of wildlife images into critical data points that help us better understand, protect and save wildlife populations around the world,” explains Eric H. Fegraus, Senior Director, Conservation Technology.

“Google Cloud’s AI Platform helps us reliably serve machine learning models and easily integrate their predictions with our application. Fast predictions, in a responsive and scalable GPU hardware environment, are critical for our user experience.”

Support for running custom containers in which to train models has also become generally available. Users can supply their own Docker images with an ML framework preinstalled to run on AI Platform. Developers can test container images locally before they’re deployed to the cloud.
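
As a rough illustration, a training job that runs a user-supplied image might be submitted like this through the same REST API; the project ID, job ID, image URI, machine type, and arguments are hypothetical, and the exact TrainingInput fields should be checked against the current API reference:

    # Submit an AI Platform Training job that runs a custom Docker image.
    from googleapiclient import discovery

    service = discovery.build("ml", "v1")

    job_body = {
        "jobId": "custom_container_train_001",       # hypothetical job ID
        "trainingInput": {
            "region": "us-central1",
            "scaleTier": "CUSTOM",
            "masterType": "n1-standard-8",
            "masterConfig": {
                # Image with the ML framework preinstalled, pushed to Container Registry.
                "imageUri": "gcr.io/my-project/my-trainer:latest",
            },
            "args": ["--epochs", "10"],              # arguments forwarded to the container
        },
    }

    job = service.projects().jobs().create(
        parent="projects/my-project", body=job_body  # hypothetical project ID
    ).execute()
    print(job["state"])                              # newly created jobs start as QUEUED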

Customers aiming to use the platform for inference – hosting a trained model that responds with predictions – can now do so. Machine learning models can be hosted on Google Cloud AI Platform, with AI Platform Prediction used to infer target values for new data.
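
For example, a deployed model version can be queried for online predictions roughly as follows; the project, model, and version names, the input file, and the instance payload are placeholders, since the input format depends entirely on the deployed model’s signature:

    # Request online predictions for new data from a deployed model version.
    import base64

    from googleapiclient import discovery

    service = discovery.build("ml", "v1")

    name = "projects/my-project/models/my_model/versions/v1_gpu"  # hypothetical
    with open("camera_trap.jpg", "rb") as f:                      # e.g. a wildlife image
        payload = base64.b64encode(f.read()).decode("utf-8")

    # The instance keys must match the deployed model's expected input schema.
    body = {"instances": [{"image_bytes": {"b64": payload}}]}

    response = service.projects().predict(name=name, body=body).execute()
    if "error" in response:
        raise RuntimeError(response["error"])
    print(response["predictions"])                                # inferred target values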

Finally, AI Platform Prediction is now built on Kubernetes, which has enabled Google to “build a reliable and fast serving system with all the flexibility that machine learning demands.”
