
Saturday, November 25, 2017

Google Cloud Platform cuts the price of GPUs by up to 36 percent

Google today announced that it's cutting the price of using Nvidia's Tesla GPUs through its Compute Engine by up to 36 percent. In U.S. regions, using the somewhat older K80 GPUs will now cost $0.45 per hour, while the newer and more powerful P100 machines will cost $1.46 per hour (all with per-second billing).

The company is also dropping the prices for preemptible local SSDs by almost 40 percent. "Preemptible local SSDs" refers to local SSDs attached to Google's preemptible VMs. You can't attach GPUs to preemptible instances, though, so this is a nice little bonus announcement, but it isn't going to directly benefit GPU users.

As for the new GPU pricing, it's clear that Google is aiming this feature at developers who want to run their own machine learning workloads on its cloud, though there are also a number of other applications, including physical simulations and molecular modeling, that benefit greatly from the thousands of cores now available on these GPUs. The P100, which is officially still in beta on the Google Cloud Platform, features 3584 cores, for example. Developers can attach up to four P100 and eight K80 dies to each instance. Like regular VMs, GPU users will also receive sustained-use discounts, though most users probably don't keep their GPUs running for a full month.

It's hard not to see this announcement in light of AWS's upcoming annual developer conference, which will take over most of Las Vegas's hotel conference space next week. AWS is expected to make a number of AI and machine learning announcements, and chances are we'll see some price cuts from AWS, too.
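To get a rough feel for what per-second billing means at the new rates, here is a minimal sketch in Python. The hourly prices are the announced U.S.-region figures quoted above; the GPU counts and run durations are made-up examples, and the function name is purely illustrative.

    # Rough cost estimate for GPU usage under Google's new per-second billing.
    # Hourly rates are the announced U.S.-region prices; durations are hypothetical.

    GPU_PRICE_PER_HOUR = {
        "nvidia-tesla-k80": 0.45,   # older K80, per GPU per hour
        "nvidia-tesla-p100": 1.46,  # newer P100 (still in beta), per GPU per hour
    }

    def gpu_cost(gpu_type: str, gpu_count: int, seconds: int) -> float:
        """Cost of attaching gpu_count GPUs of gpu_type for the given number of seconds."""
        hourly = GPU_PRICE_PER_HOUR[gpu_type]
        return hourly / 3600 * seconds * gpu_count

    # Example: a 90-minute run on eight K80 dies vs. four P100 dies,
    # the per-instance maximums mentioned in the article.
    print(f"8x K80, 90 min:  ${gpu_cost('nvidia-tesla-k80', 8, 90 * 60):.2f}")   # $5.40
    print(f"4x P100, 90 min: ${gpu_cost('nvidia-tesla-p100', 4, 90 * 60):.2f}")  # $8.76

Because billing is per second, a short run like this costs a fraction of an hourly charge; sustained-use discounts, where applicable, would lower the effective rate further.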

https://techcrunch.com/2017/11/20/google-cloud-platform-cuts-the-price-of-gpus-by-up-to-36-percent/
