
Wednesday, April 5, 2017

Building an AI Chip Saved Google From Building a Dozen New Data Centers

Google operates what is surely the largest computer network on Earth, a system comprising custom-built, warehouse-sized data centers across 15 locations on four continents. But about six years ago, as the company embraced a new form of voice recognition on Android phones, its engineers worried that this network wasn’t nearly big enough. If each of the world’s Android phones used the new Google voice search for just three minutes a day, these engineers realized, the company would need twice as many data centers.

At the time, Google was just beginning to drive its voice recognition services with deep neural networks, complex mathematical systems that can learn particular tasks by analyzing vast amounts of data. In recent years, this form of machine learning has rapidly reinvented not just voice recognition but image recognition, machine translation, internet search, and more. In moving to this method, Google saw error rates drop a good 25 percent. But the shift required a lot of extra horsepower.

Rather than double its data center footprint, Google built its own computer chip specifically for running deep neural networks, called the Tensor Processing Unit, or TPU. “It makes sense to have a solution there that is much more energy efficient,” says Norm Jouppi, one of the more than 70 engineers who worked on the chip. In fact, the TPU outperforms standard processors by 30 to 80 times in TOPS/Watt (tera-operations per second per watt), a metric of efficiency.
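To make the TOPS/Watt comparison concrete, here is a minimal sketch of how the metric is computed. The throughput and power figures below are hypothetical placeholders chosen only to land inside the 30-to-80x range the article cites; they are not measured numbers for the TPU or any real processor.

```python
# Illustrative only: how a TOPS/Watt efficiency comparison works.
# All throughput and power numbers are hypothetical placeholders,
# not figures from Google or the TPU paper.

def tops_per_watt(tera_ops_per_second: float, watts: float) -> float:
    """Efficiency metric: tera-operations per second delivered per watt consumed."""
    return tera_ops_per_second / watts

# A hypothetical accelerator vs. a hypothetical general-purpose CPU.
accelerator = tops_per_watt(tera_ops_per_second=90.0, watts=40.0)  # 2.25 TOPS/W
cpu = tops_per_watt(tera_ops_per_second=2.6, watts=90.0)           # ~0.029 TOPS/W

print(f"Accelerator: {accelerator:.2f} TOPS/W")
print(f"CPU:         {cpu:.3f} TOPS/W")
print(f"Efficiency ratio: {accelerator / cpu:.0f}x")  # ~78x with these made-up numbers
```

The ratio, not either raw number, is what matters for the data center math: a chip that does the same work for a fraction of the power lets the same buildings serve far more queries.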

https://www.wired.com/2017/04/building-ai-chip-saved-google-building-dozen-new-data-centers/
