Google I/O: Custom TPU chip amplifies machine learning performance

Google on Wednesday revealed that for the past year, it’s been powering its data centers with a custom-built Tensor Processing Unit (TPU) chip designed for machine learning and tailored for TensorFlow. Google started working on the chip because “for machine learning, the scale at which we need to do computing is incredible,” Google CEO Sundar Pichai said in the keynote address at the Google I/O conference. The result, the company announced in a blog post, is that the chips “deliver an order of magnitude better-optimized performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).” Because the chip is tailored for machine learning, it’s more tolerant of reduced computational precision and requires fewer transistors per operation. “We’re innovating…
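To make the reduced-precision point concrete, here is a minimal sketch of linear 8-bit quantization, the general technique that trades numeric precision for cheaper arithmetic. This is an illustration of the idea only, not Google's actual TPU implementation; the function names and the example weights are hypothetical.

```python
def quantize_int8(values):
    """Map float values onto signed 8-bit integers with a linear scale.

    Reduced-precision arithmetic like this is the trade-off the article
    describes: an 8-bit integer multiply needs far fewer transistors than
    a 32-bit float multiply, at the cost of some rounding error.
    """
    scale = max(abs(v) for v in values) / 127 or 1.0
    quantized = [max(-128, min(127, round(v / scale))) for v in values]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float values from the int8 representation."""
    return [q * scale for q in quantized]

# Hypothetical neural-network weights, round-tripped through int8.
weights = [0.42, -1.3, 0.07, 0.9]
q, s = quantize_int8(weights)
approx = dequantize(q, s)
# Each recovered value is close to the original; for many inference
# workloads that small rounding error is an acceptable trade.
```

The design point is that once values fit in 8 bits, the multiply-accumulate units that dominate machine-learning workloads can be made far smaller and denser, which is one way a special-purpose chip can beat a general-purpose one on performance per watt.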
