Google built its own processor to power machine learning

Machine learning is an integral part of Google’s strategy, powering many of the company’s top applications. It’s so important, in fact, that the search giant has spent years behind closed doors developing its own custom silicon to power those experiences. The result of that work is what Google calls a Tensor Processing Unit (TPU): a custom application-specific integrated circuit (ASIC) built to work with TensorFlow, Google’s second-generation machine learning system. Norm Jouppi, a distinguished hardware engineer at Google, said the company has been running TPUs inside its data centers for more than a year and has found them to deliver an order of magnitude better-optimized performance per watt for machine learning. That gain, he noted, is roughly equivalent to fast-forwarding technology about seven years into the future (or…
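As a back-of-envelope check (my own sketch, not from the article): if you assume Moore’s Law doubles performance roughly every two years, then an order-of-magnitude (10×) gain in performance per watt works out to about log2(10) ≈ 3.3 doublings, or roughly 6.6 years — consistent with the “about seven years” figure quoted above.

```python
import math

# Back-of-envelope: how many Moore's Law doublings equal a 10x gain,
# and how many years that spans at an assumed ~2 years per doubling.
doublings = math.log2(10)   # ~3.32 doublings needed for a 10x improvement
years = doublings * 2       # ~6.6 years, close to the quoted "seven years"
print(round(doublings, 2), round(years, 1))
```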

Link to Full Article: Google built its own processor to power machine learning
